
vijay

02/09/2023, 10:15 PM
Hi all, is it good practice to have side effects in a Delta Live Tables pipeline? I'd like to write data to a Postgres DB as the last step.

JosephK (exDatabricks)

02/09/2023, 10:33 PM
Could you put the DLT pipeline into a workflow and then have a notebook in the workflow that does the write?

vijay

02/09/2023, 10:35 PM
This would be a continuously running DLT pipeline. Will I be able to integrate that with a regular workflow?

JosephK (exDatabricks)

02/09/2023, 10:38 PM
Hmm, I think continuous mode won't work because the pipeline never finishes, so it would never trigger the second part of the workflow. I think in the Python API you can have a write to JDBC as a sink.
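A minimal sketch of that JDBC-sink idea. The table and connection details below are placeholders, and note that Structured Streaming has no native streaming JDBC writer, so the usual pattern is `foreachBatch`, which hands each micro-batch to the batch JDBC writer:

```python
# Hypothetical micro-batch sink: each micro-batch is written to
# Postgres with Spark's batch JDBC writer via foreachBatch.
# URL, table, and credentials are placeholders.

def write_to_postgres(batch_df, batch_id):
    """Append one micro-batch to a Postgres table over JDBC."""
    (batch_df.write
        .format("jdbc")
        .option("url", "jdbc:postgresql://host:5432/mydb")
        .option("dbtable", "public.events")
        .option("user", "pg_user")
        .option("password", "pg_password")
        .mode("append")
        .save())

# Wiring it up inside a Spark session (e.g. a notebook after the
# DLT pipeline's output table):
# (spark.readStream.table("events")
#     .writeStream
#     .foreachBatch(write_to_postgres)
#     .start())
```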

vijay

02/09/2023, 11:12 PM
Thank you!
Here it says: "However, you can include these functions outside of `table` or `view` function definitions because this code is run once during the graph initialization phase."
So it will run only once, during the initialization phase, but I want to run this every time I process the stream data.
Can I include the sink inside the table definition? Something like this:
@dlt.table
def my_table():
    df = spark.readStream.table("source")  # read the stream data
    write_df_to_postgres(df)               # side effect: save to the DB
    return df

JosephK (exDatabricks)

02/09/2023, 11:36 PM
I think it has to be outside the table definition, but I'm not 100% sure. DLT is too new!