Max value of multiple columns in Lakehouse

I’m looking for a TimeXtender Fabric Lakehouse (pyspark.sql) expert 😀

I’ve got a SQL transformation that selects the maximum value of 10 columns:

(select max(val) from (values ([Column_1]), ([Column_2]), ..., ([Column_10])) as [Values](val))

How do I achieve this in a Prepare Lakehouse custom transformation? I could build the mother/father of all massive case statements, but I’d prefer something simpler and more elegant … if possible!?
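
To be concrete about the kind of thing I’m hoping exists: a minimal PySpark sketch of the row-wise maximum using Spark’s built-in greatest() function. The DataFrame df, the sample data, and the three column names below are just placeholders (nothing TimeXtender-specific), and I’m not sure how this maps onto what a Prepare Lakehouse custom transformation actually accepts:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Placeholder data with three of the ten columns, just to illustrate.
df = spark.createDataFrame(
    [(1, 5, 3), (10, 2, 7)],
    ["Column_1", "Column_2", "Column_3"],
)

cols = ["Column_1", "Column_2", "Column_3"]  # would be Column_1 ... Column_10

# greatest() returns the row-wise maximum of the listed columns, skipping NULLs,
# so no CASE expression is needed.
df = df.withColumn("MaxValue", F.greatest(*[F.col(c) for c in cols]))
df.show()

In a plain Spark SQL expression the equivalent would presumably be greatest(Column_1, Column_2, ..., Column_10), which skips NULLs much like the max-over-VALUES pattern above.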