Exporting String data from BigQuery to GCS (CSV) without double quotes
Asked by passionate on 2019-10-27T18:09:34


I have a concatenated string in a BigQuery table that gets enclosed in double quotes when I export the data as CSV to GCS. Is there a way I can avoid the double quotes in the files?

Copyright Notice: Content Author: 「passionate」, reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/58578392/exporting-string-data-from-bigquery-to-gcs-csv-without-double-quotes

Answers
Answered by Aragorn on 2019-11-06T12:21:36

The reason the final CSV has quotes is that the string contains ',', which is also the default delimiter, so the export has to quote the field to keep it parseable. Specifying a delimiter other than ',' should work:

bq extract --format none --noprint_header --field_delimiter "|" [table_name] [gcs_file_location]
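The quoting behavior the answer describes is standard minimal CSV quoting, not something BigQuery-specific: a writer only wraps a field in double quotes when the field contains the delimiter itself. A small sketch with Python's stdlib csv module (the sample row and field values are made up for illustration) shows why switching the delimiter makes the quotes disappear:

```python
import csv
import io

# A row whose second field is a concatenated string containing commas.
row = ["id1", "a,b,c"]

# With the default comma delimiter, the writer must quote the field,
# otherwise the embedded commas would be read back as extra columns.
buf = io.StringIO()
csv.writer(buf).writerow(row)
comma_out = buf.getvalue().strip()

# With a pipe delimiter there is no conflict, so no quotes are added.
buf = io.StringIO()
csv.writer(buf, delimiter="|").writerow(row)
pipe_out = buf.getvalue().strip()

print(comma_out)  # id1,"a,b,c"
print(pipe_out)   # id1|a,b,c
```

Note that if the string ever contains the new delimiter (here '|') or a double quote, the field will be quoted again, so pick a character that cannot occur in the data.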

