Setting table expiration time using Dataflow BigQuery sink
Is there a way to set an expiration time on a BigQuery table when using Dataflow's BigQueryIO.Write sink?
For example, I'd like to do the following (see the last line):
    PCollection<TableRow> mainResults = ...;
    mainResults.apply(BigQueryIO.Write
        .named("my-bq-table")
        .to("project:dataset.table")
        .withSchema(getBigQueryTableSchema())
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED))
        .withExpiration(1452030098L) // ** this table should expire on 31st Jan **

I can't see anything in the Dataflow API that facilitates this. Of course, I could use the BigQuery API directly, but it would be much better to be able to do this via Dataflow when specifying the sink.
This isn't currently supported in the Dataflow API. We can look at adding it soon; it should be a straightforward addition.
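In the meantime, the workaround the question alludes to is to set the expiration with the BigQuery API after the pipeline has written the table. Below is a minimal sketch (not code from the question or answer), assuming the google-api-services-bigquery v2 client and application-default credentials; the project/dataset/table IDs, the application name, and the timestamp are placeholders, and the expirationTime field expects epoch milliseconds.

    import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
    import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
    import com.google.api.client.json.jackson2.JacksonFactory;
    import com.google.api.services.bigquery.Bigquery;
    import com.google.api.services.bigquery.BigqueryScopes;
    import com.google.api.services.bigquery.model.Table;

    public class SetTableExpiration {
      public static void main(String[] args) throws Exception {
        // Build an authenticated BigQuery client using application-default credentials.
        Bigquery bigquery = new Bigquery.Builder(
                GoogleNetHttpTransport.newTrustedTransport(),
                JacksonFactory.getDefaultInstance(),
                GoogleCredential.getApplicationDefault().createScoped(BigqueryScopes.all()))
            .setApplicationName("set-table-expiration") // placeholder name
            .build();

        // expirationTime is epoch milliseconds; 1454198400000L is 2016-01-31 00:00:00 UTC.
        Table patch = new Table().setExpirationTime(1454198400000L);

        // Patch only the expirationTime of the table the Dataflow job wrote to
        // (placeholder project/dataset/table IDs).
        bigquery.tables().patch("project", "dataset", "table", patch).execute();
      }
    }

Running this once the Dataflow job has finished leaves the table's schema and data untouched and only updates its expiration.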