I’m looking for a TimeXtender Fabric Lakehouse (pyspark.sql) expert.
I’ve got a SQL transformation that selects the maximum value of 10 columns:
(select max(val) from (values ([Column_1]), ([Column_2]), ..., ([Column_10])) as [Values](val))
How do I achieve this in a Prepare Lakehouse custom transformation?
I could build the mother/father of all massive CASE statements, but I’d prefer something simpler and more elegant, if possible!
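To show what I’m hoping for: something along these lines, assuming Spark SQL’s built-in `greatest()` function is usable inside the custom transformation (the table name `my_table` is just a placeholder for illustration):

```sql
-- Spark SQL's greatest() takes any number of columns and returns the
-- largest non-null value per row (it only returns NULL if all inputs
-- are NULL), which would match the behaviour of the VALUES trick above.
SELECT greatest(
  Column_1, Column_2, Column_3, Column_4, Column_5,
  Column_6, Column_7, Column_8, Column_9, Column_10
) AS max_val
FROM my_table
```

If that (or its `pyspark.sql.functions.greatest` equivalent) works in a Prepare Lakehouse custom transformation, it would save me the giant CASE expression.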