Setting table expiration time using Dataflow BigQuery sink


Is there a way to set an expiration time on a BigQuery table when using Dataflow's BigQueryIO.Write sink?

For example, I'd like to do something like this (see the last line):

    PCollection<TableRow> mainResults = ...;
    mainResults.apply(BigQueryIO.Write
        .named("my-bq-table")
        .to("project:dataset.table")
        .withSchema(getBigQueryTableSchema())
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
        .withExpiration(1452030098L)); // **this table should expire on 31st Jan

I can't see anything in the Dataflow API to facilitate this. Of course, I could use the BigQuery API directly, but it would be much better to be able to do this via Dataflow when specifying the sink.

This isn't supported in the Dataflow API. We can look at adding it soon; it should be a straightforward addition.
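In the meantime, you can set the expiration yourself with the BigQuery API once the pipeline has written the table. A minimal sketch, assuming the low-level com.google.api.services.bigquery client and an already-authorized Bigquery instance (the setTableExpiration helper name is mine); note that Table.expirationTime is in milliseconds since the epoch, not seconds:

    import java.io.IOException;

    import com.google.api.services.bigquery.Bigquery;
    import com.google.api.services.bigquery.model.Table;

    // Patches an existing table so BigQuery deletes it at the given time.
    // expirationMillis is an absolute timestamp in milliseconds since the epoch.
    static void setTableExpiration(Bigquery bigquery, String projectId,
        String datasetId, String tableId, long expirationMillis) throws IOException {
      Table patch = new Table().setExpirationTime(expirationMillis);
      bigquery.tables().patch(projectId, datasetId, tableId, patch).execute();
    }

For example, to have the table expire at the end of 31st Jan 2016 (UTC):

    setTableExpiration(bigquery, "project", "dataset", "table", 1454284800000L);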

