2018-10-22         Penelope

BigQuery Error: 8822097

When trying to load a JSON file to BigQuery, I get the following error: "An internal error occurred and the request could not be completed. Error: 8822097". Is this error related to hitting the BigQuery daily load limit? It would be great if someone could point me to a glossary of errors. {Location: ""; Message: "An internal error occurred and the request could not be completed. Error: 8822097"; Reason: "internalError"} Thanks! Are you trying to load different types of file in a single command? It may happen when you try to load from a Google Storage path with both compresse...

 google-bigquery                     1 answer                     39 views
 2018-10-22         Pamela

Is there a BigQuery version of isnumeric

I need to test whether a field is numeric or not using standard SQL in BigQuery. The example below works and is similar to what I have done in Cognos using TRANSLATE('mystring','1234567890.',''), but it's not very elegant. SELECT IF(LENGTH(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE(REPLACE('1234.56','1',''),'2',''),'3',''),'4',''),'5',''),'6',''),'7',''),'8',''),'9',''),'0',''),'.',''))=0,'A number','Not a number') You can use SAFE_CAST to try casting to a number. SAFE_CAST behaves like CAST, but if the cast fails, instead of erroring, null is...
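A minimal sketch of the SAFE_CAST approach described in the answer, using placeholder sample values:

#standardSQL
-- SAFE_CAST returns NULL instead of raising an error when the cast fails,
-- so a non-NULL result means the string is numeric.
SELECT
  value_str,
  SAFE_CAST(value_str AS FLOAT64) IS NOT NULL AS is_numeric
FROM UNNEST(['1234.56', 'abc', '42']) AS value_str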

 google-bigquery                     3 answers                     42 views
 2018-10-22         Tammy

Make existing BigQuery table clustered

I have quite a huge existing partitioned table in BigQuery. I want to make the table clustered, at least for the new partition. From the documentation: https://cloud.google.com/bigquery/docs/creating-clustered-tables, it says we are able to create a clustered table when loading data, and I have tried to load a new partition using clustering fields: job_config.clustering_fields = ["event_type"]. The load finished successfully; however, it seems that the new partition is not clustered (I am not really sure how to check whether it is clustered or not, but when I query t...
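One possible route, not taken from the truncated answer: where BigQuery DDL supports clustered tables, the existing data can be rewritten into a new partitioned and clustered copy. The table and column names below (mydataset.events, event_ts) are placeholders; event_type is the clustering field from the question.

-- Rewrites the existing table into a partitioned, clustered copy.
-- event_ts is a hypothetical TIMESTAMP column used for partitioning.
CREATE TABLE `mydataset.events_clustered`
PARTITION BY DATE(event_ts)
CLUSTER BY event_type
AS
SELECT * FROM `mydataset.events`;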

 google-bigquery                     1 answer                     42 views
 2018-10-22         Bennett

Using CLI for data load into column type partitioning: Incompatible table partitioning specification

Doing a simple BQ load of some CSVs into a new schema using this command, we get the error below: bq load --time_partitioning_field saved_timestamp --skip_leading_rows=1 --max_bad_records=100 --allow_jagged_rows --replace --source_format=CSV --ignore_unknown_values TABLE gs://.../export*.gz schema.json Incompatible table partitioning specification. Expects partitioning specification none, but input partitioning specification is interval(type:day,field:saved_timestamp) My expectation would be to create a column-type partitioning column. What's wrong? Also, can we use the same syntax to sp...

 google-bigquery                     1 answer                     43 views
 2018-10-22         Oswald

How to convert an array column to rows in BigQuery

I have a column as in the below screenshot in my Google BigQuery. I need to convert that column to rows as below in BigQuery: 70042, 70055, 70044, 70046, 70042, 70055, 70044, 70046. Please suggest how I can get the rows like above. Below are examples for BigQuery Standard SQL. The first is applicable if your column is an array of strings, and the second in case it is a string that looks like an array :o) #standardSQL WITH `project.dataset.table` AS ( SELECT 1 id, ['70042', '70055', '70044', '70046'] dspid UNION ALL SELECT 2 id, ['70042', '70055', '70044', '70046'] dspid ) SELECT id, dspid FROM `project.data...
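A minimal sketch of the UNNEST approach for the array case, completing the truncated example with the same placeholder table:

#standardSQL
WITH `project.dataset.table` AS (
  SELECT 1 id, ['70042', '70055', '70044', '70046'] dspid
)
-- the implicit CROSS JOIN with UNNEST turns each array element into its own row
SELECT id, item
FROM `project.dataset.table`, UNNEST(dspid) AS item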

 google-bigquery                     1 answer                     45 views
 2018-10-22         Miles

How can I insert a field from storage into my SQL query in BigQuery

I have made an SQL statement like the following example: SELECT ip FROM ip_table LIMIT 500 Then I saved the result to Google Cloud Storage in CSV format. Now I find that I want more data about the ips I queried previously. Can I read the ips that I saved in the previous query and use them in a new query like this: SELECT more_info FROM ip_table WHERE ip = ip_from_my_csv_file where ip_from_my_csv_file should iterate over the ips I have in my csv file. Can you help me achieve this? You can create an external table (for example named my_csv_file) on top of your csv file (see U...
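Assuming the external table suggested in the answer has been created as mydataset.my_csv_file, a minimal sketch of the follow-up query; mydataset, ip_table, and more_info are placeholder names taken from the question:

#standardSQL
-- fetch extra details only for the ips captured in the earlier export
SELECT t.ip, t.more_info
FROM `mydataset.ip_table` AS t
WHERE t.ip IN (SELECT ip FROM `mydataset.my_csv_file`)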

 google-bigquery                     1 answer                     45 views
 2018-10-22         Veronica

Why did my Google BigQuery query take so long?

It took over 18 minutes to run the following query with our test dataset: SELECT count(distinct S1.visitorId, 50000) as returningVisitors, STRFTIME_UTC_USEC(UTC_USEC_TO_DAY(PARSE_UTC_USEC(S1.timeStamp)), '%Y-%m-%d') AS day, S1.dimension1, S1.dimension2 FROM [myDataset.MyTable] as S1 JOIN EACH [myDataset.MyTable] as S2 on S1.visitorId = S2.visitorId WHERE UTC_USEC_TO_DAY(PARSE_UTC_USEC(S1.timeStamp)) < UTC_USEC_TO_DAY(NOW()) and S2.timeStamp < STRFTIME_UTC_USEC(UTC_USEC_TO_DAY(PARSE_UTC_USEC(S1.timeStamp)), '%Y-%m-%d') GROUP EACH BY S1.dimension1, S1.dimension2, day ORDER BY...

 google-bigquery                     1 answer                     46 views
 2018-10-22         Martin

BigQuery Maximum name length

What are the maximum lengths for identifiers in BigQuery (names for projects, datasets, tables, columns)? The documentation just says "string" wherever these identifiers are referenced, but I can't find any indication of maximum sizes for these. Thanks. The maximum length for table, dataset, and job ids is 1024 chars. The maximum length for field names is 128 chars. I don't know what the maximum length for a project name is, however. [XXX]

 google-bigquery                     1 answer                     48 views
 2018-10-22         Hamiltion

BigQuery table too fragmented - unable to rectify

I have got a Google BigQuery table that is too fragmented, meaning that it is unusable. Apparently there is supposed to be a job running to fix this, but it doesn't seem to have stopped the issue for me. I have attempted to fix this myself, with no success. Steps tried: copying the table and deleting the original - this does not work as the table is too fragmented for the copy; exporting the file and reimporting - I managed to export to Google Cloud Storage, as the file was JSON, so couldn't download - this was fine. The problem was on re-import. I was trying to use the web int...

 google-bigquery                     1 answer                     47 views
 2018-10-22         Norman

Google BigQuery: Are we charged when resources exceed during query execution?

I wonder if we are charged when the query fails and we get the error message: "Error: Resources exceeded during query execution." No. You're only charged for successful queries. [XXX]

 google-bigquery                     1 answer                     51 views
 2018-10-22         Allen

Why can't I reference scoped aggregates in the WHERE or HAVING clauses?

I noticed that BQL won't allow me to reference scoped aggregates in either the WHERE or HAVING clauses. For example:

% bq query 'SELECT fullName, COUNT(children.name) WITHIN RECORD as numChildren FROM [persons.person]'
+---------------+-------------+
|   fullName    | numChildren |
+---------------+-------------+
| John Doe      |           2 |
| Mike Jones    |           3 |
| Anna Karenina |           0 |
+---------------+-------------+
% bq query 'SELECT fullName, COUNT(children.name) WITHIN RECORD as numChildren FROM [persons.person] WHERE numChildren > 0'
BigQuery error i...
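One common workaround, not shown in the truncated answer: compute the scoped aggregate in a subselect, then filter on its alias in the outer query (legacy SQL, same placeholder schema):

SELECT fullName, numChildren
FROM (
  -- the WITHIN RECORD aggregate is evaluated here, where it is allowed
  SELECT fullName, COUNT(children.name) WITHIN RECORD AS numChildren
  FROM [persons.person]
)
WHERE numChildren > 0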

 google-bigquery                     2 answers                     51 views
 2018-10-22         Ingemar

Is there a function to get the max of two values in Google BigQuery?

I want to get the maximum value of 2 integers (or 2 floats). I know I can do it with an IF function like this: IF(column1 > column2, column1, column2), however I was wondering if a function to do that exists or if there is a plan to add that kind of function in the future. In MySQL there is the GREATER function that can do that. Example: GREATER(column1, column2). BigQuery supports GREATEST(expr1, expr2, ...), which returns the largest argument. I've filed an internal bug to get this added to our public documentation. [XXX] There isn't currently a function to return the g...
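A minimal sketch of GREATEST in BigQuery Standard SQL, using placeholder columns:

#standardSQL
-- GREATEST returns the largest of its arguments (and NULL if any argument is NULL)
SELECT
  column1,
  column2,
  GREATEST(column1, column2) AS max_value
FROM UNNEST([STRUCT(3 AS column1, 7 AS column2),
             STRUCT(10 AS column1, 2 AS column2)])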

 google-bigquery                     2 answers                     57 views
 2018-10-22         Christ

Dataflow insert into BigQuery fails with large number of files for asia-northeast1 location

I am using the Cloud Storage Text to BigQuery template on Cloud Composer. The template is kicked off from the Python Google API client. The same program works fine in the US location (for Dataflow and BigQuery), fails in the asia-northeast1 location, and works fine with fewer (less than 10000) input files in the asia-northeast location. Does anybody have an idea about this? I want to execute in the asia-northeast location for business reasons. More details about the failure: the program worked until "ReifyRenameInput" and then failed. The Dataflow job failed with the error message below: java.io.IOException: Unable...

 google-bigquery                     1 answer                     60 views
 2018-10-22         Robin

Google Cloud Dataflow Pub/Sub to BigQuery template WriteSuccessfulRecords wall time

Currently getting astronomical wall time with the standard pubsub_to_bigquery template. Only parsing about 10 keys. WriteSuccessfulRecords is showing over 11 hours! When I break this out, I see that StreamingWrite is the culprit; however, I can see the data immediately in BigQuery. Is this just a buffering issue (i.e. keeping the buffer available/open for extended periods) or should I be concerned? In streaming mode the wall time of the step will accumulate forever since the input is unbounded. The reason you're seeing such a high wall time is because the pipeline has been ...

 google-bigquery                     1 answer                     61 views
 2018-10-22         Egbert

Infer Avro schema for BigQuery table load

I'm using the Java API, trying to load data from Avro files into BigQuery. When creating external tables, BigQuery automatically detects the schema from the .avro files. Is there a way to specify a schema/data file in GCS when creating a regular BigQuery table for data to be loaded into? Thank you in advance. You could manually create the schema definition with configuration.load.schema; however, the documentation says that: When you load Avro, Parquet, ORC, Cloud Firestore export data, or Cloud Datastore export data, BigQuery infers the schema from the source data. ...

 google-bigquery                     2 answers                     62 views
 2018-10-22         Elsie

Create BigQuery external tables partitioned by one/multiple columns

I am porting a Java application from Hadoop/Hive to Google Cloud/BigQuery. The application writes Avro files to HDFS and then creates Hive external tables with one/multiple partitions on top of the files. I understand BigQuery only supports date/timestamp partitions for now, and no nested partitions. The way we now handle Hive is that we generate the DDL and then execute it with a REST call. I could not find support for CREATE EXTERNAL TABLE in the BigQuery DDL docs, so I've switched to using the Java library. I managed to create an external table, but I cannot find any refere...

 google-bigquery                     1 answer                     64 views
