== Physical Plan ==
TakeOrderedAndProject (26)
+- * Filter (25)
   +- * HashAggregate (24)
      +- * ColumnarToRow (23)
         +- CometColumnarExchange (22)
            +- RowToColumnar (21)
               +- * HashAggregate (20)
                  +- * ColumnarToRow (19)
                     +- CometProject (18)
                        +- CometBroadcastHashJoin (17)
                           :- CometProject (13)
                           :  +- CometBroadcastHashJoin (12)
                           :     :- CometProject (7)
                           :     :  +- CometBroadcastHashJoin (6)
                           :     :     :- CometFilter (2)
                           :     :     :  +- CometScan parquet spark_catalog.default.inventory (1)
                           :     :     +- CometBroadcastExchange (5)
                           :     :        +- CometFilter (4)
                           :     :           +- CometScan parquet spark_catalog.default.warehouse (3)
                           :     +- CometBroadcastExchange (11)
                           :        +- CometProject (10)
                           :           +- CometFilter (9)
                           :              +- CometScan parquet spark_catalog.default.item (8)
                           +- CometBroadcastExchange (16)
                              +- CometFilter (15)
                                 +- CometScan parquet spark_catalog.default.date_dim (14)

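For orientation, the original SQL text is not part of this output; the following is a sketch reconstructed from the operator details below (table names, join keys, pushed filters, the 2000-03-11 split date, and the 100-row limit are all read off the plan), assuming a SparkSession named `spark` with the default-catalog tables available. Treat the exact formulation as an assumption, not the source query.

  val result = spark.sql("""
    SELECT w_warehouse_name, i_item_id, inv_before, inv_after
    FROM (
      SELECT w_warehouse_name,
             i_item_id,
             SUM(CASE WHEN d_date <  DATE '2000-03-11' THEN inv_quantity_on_hand ELSE 0 END) AS inv_before,
             SUM(CASE WHEN d_date >= DATE '2000-03-11' THEN inv_quantity_on_hand ELSE 0 END) AS inv_after
      FROM inventory
      JOIN warehouse ON inv_warehouse_sk = w_warehouse_sk
      JOIN item      ON inv_item_sk      = i_item_sk
      JOIN date_dim  ON inv_date_sk      = d_date_sk
      WHERE i_current_price BETWEEN 0.99 AND 1.49
        AND d_date BETWEEN DATE '2000-02-10' AND DATE '2000-04-10'
      GROUP BY w_warehouse_name, i_item_id
    ) x
    WHERE CASE WHEN inv_before > 0 THEN inv_after / inv_before END BETWEEN 2.0 / 3.0 AND 3.0 / 2.0
    ORDER BY w_warehouse_name, i_item_id
    LIMIT 100
  """)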

(1) Scan parquet spark_catalog.default.inventory
Output [4]: [inv_item_sk#1, inv_warehouse_sk#2, inv_quantity_on_hand#3, inv_date_sk#4]
Batched: true
Location: InMemoryFileIndex []
PartitionFilters: [isnotnull(inv_date_sk#4), dynamicpruningexpression(inv_date_sk#4 IN dynamicpruning#5)]
PushedFilters: [IsNotNull(inv_warehouse_sk), IsNotNull(inv_item_sk)]
ReadSchema: struct<inv_item_sk:int,inv_warehouse_sk:int,inv_quantity_on_hand:int>

(2) CometFilter
Input [4]: [inv_item_sk#1, inv_warehouse_sk#2, inv_quantity_on_hand#3, inv_date_sk#4]
Condition : (isnotnull(inv_warehouse_sk#2) AND isnotnull(inv_item_sk#1))

(3) Scan parquet spark_catalog.default.warehouse
Output [2]: [w_warehouse_sk#6, w_warehouse_name#7]
Batched: true
Location [not included in comparison]/{warehouse_dir}/warehouse]
PushedFilters: [IsNotNull(w_warehouse_sk)]
ReadSchema: struct<w_warehouse_sk:int,w_warehouse_name:string>

(4) CometFilter
Input [2]: [w_warehouse_sk#6, w_warehouse_name#7]
Condition : isnotnull(w_warehouse_sk#6)

(5) CometBroadcastExchange
Input [2]: [w_warehouse_sk#6, w_warehouse_name#7]
Arguments: [w_warehouse_sk#6, w_warehouse_name#7]

(6) CometBroadcastHashJoin
Left output [4]: [inv_item_sk#1, inv_warehouse_sk#2, inv_quantity_on_hand#3, inv_date_sk#4]
Right output [2]: [w_warehouse_sk#6, w_warehouse_name#7]
Arguments: [inv_warehouse_sk#2], [w_warehouse_sk#6], Inner, BuildRight

(7) CometProject
Input [6]: [inv_item_sk#1, inv_warehouse_sk#2, inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_sk#6, w_warehouse_name#7]
Arguments: [inv_item_sk#1, inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7], [inv_item_sk#1, inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7]

(8) Scan parquet spark_catalog.default.item
Output [3]: [i_item_sk#8, i_item_id#9, i_current_price#10]
Batched: true
Location [not included in comparison]/{warehouse_dir}/item]
PushedFilters: [IsNotNull(i_current_price), GreaterThanOrEqual(i_current_price,0.99), LessThanOrEqual(i_current_price,1.49), IsNotNull(i_item_sk)]
ReadSchema: struct<i_item_sk:int,i_item_id:string,i_current_price:decimal(7,2)>

(9) CometFilter
Input [3]: [i_item_sk#8, i_item_id#9, i_current_price#10]
Condition : (((isnotnull(i_current_price#10) AND (i_current_price#10 >= 0.99)) AND (i_current_price#10 <= 1.49)) AND isnotnull(i_item_sk#8))

(10) CometProject
Input [3]: [i_item_sk#8, i_item_id#9, i_current_price#10]
Arguments: [i_item_sk#8, i_item_id#9], [i_item_sk#8, i_item_id#9]

(11) CometBroadcastExchange
Input [2]: [i_item_sk#8, i_item_id#9]
Arguments: [i_item_sk#8, i_item_id#9]

(12) CometBroadcastHashJoin
Left output [4]: [inv_item_sk#1, inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7]
Right output [2]: [i_item_sk#8, i_item_id#9]
Arguments: [inv_item_sk#1], [i_item_sk#8], Inner, BuildRight

(13) CometProject
Input [6]: [inv_item_sk#1, inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7, i_item_sk#8, i_item_id#9]
Arguments: [inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7, i_item_id#9], [inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7, i_item_id#9]

(14) Scan parquet spark_catalog.default.date_dim
Output [2]: [d_date_sk#11, d_date#12]
Batched: true
Location [not included in comparison]/{warehouse_dir}/date_dim]
PushedFilters: [IsNotNull(d_date), GreaterThanOrEqual(d_date,2000-02-10), LessThanOrEqual(d_date,2000-04-10), IsNotNull(d_date_sk)]
ReadSchema: struct<d_date_sk:int,d_date:date>

(15) CometFilter
Input [2]: [d_date_sk#11, d_date#12]
Condition : (((isnotnull(d_date#12) AND (d_date#12 >= 2000-02-10)) AND (d_date#12 <= 2000-04-10)) AND isnotnull(d_date_sk#11))

(16) CometBroadcastExchange
Input [2]: [d_date_sk#11, d_date#12]
Arguments: [d_date_sk#11, d_date#12]

(17) CometBroadcastHashJoin
Left output [4]: [inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7, i_item_id#9]
Right output [2]: [d_date_sk#11, d_date#12]
Arguments: [inv_date_sk#4], [d_date_sk#11], Inner, BuildRight

(18) CometProject
Input [6]: [inv_quantity_on_hand#3, inv_date_sk#4, w_warehouse_name#7, i_item_id#9, d_date_sk#11, d_date#12]
Arguments: [inv_quantity_on_hand#3, w_warehouse_name#7, i_item_id#9, d_date#12], [inv_quantity_on_hand#3, w_warehouse_name#7, i_item_id#9, d_date#12]

(19) ColumnarToRow [codegen id : 1]
Input [4]: [inv_quantity_on_hand#3, w_warehouse_name#7, i_item_id#9, d_date#12]

(20) HashAggregate [codegen id : 1]
Input [4]: [inv_quantity_on_hand#3, w_warehouse_name#7, i_item_id#9, d_date#12]
Keys [2]: [w_warehouse_name#7, i_item_id#9]
Functions [2]: [partial_sum(CASE WHEN (d_date#12 < 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END), partial_sum(CASE WHEN (d_date#12 >= 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END)]
Aggregate Attributes [2]: [sum#13, sum#14]
Results [4]: [w_warehouse_name#7, i_item_id#9, sum#15, sum#16]

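HashAggregate (20) computes the partial sums per (w_warehouse_name, i_item_id) before the shuffle; HashAggregate (24) merges them afterwards. In the DataFrame API the whole pair corresponds to a single groupBy/agg. A minimal sketch, assuming a hypothetical DataFrame `joined` holding the four columns produced by ColumnarToRow (19):

  import org.apache.spark.sql.functions.{col, lit, sum, when}

  // `joined` is assumed to have inv_quantity_on_hand, w_warehouse_name, i_item_id, d_date.
  val aggregated = joined
    .groupBy("w_warehouse_name", "i_item_id")
    .agg(
      sum(when(col("d_date") <  lit("2000-03-11").cast("date"), col("inv_quantity_on_hand")).otherwise(0)).as("inv_before"),
      sum(when(col("d_date") >= lit("2000-03-11").cast("date"), col("inv_quantity_on_hand")).otherwise(0)).as("inv_after"))
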
(21) RowToColumnar
Input [4]: [w_warehouse_name#7, i_item_id#9, sum#15, sum#16]

(22) CometColumnarExchange
Input [4]: [w_warehouse_name#7, i_item_id#9, sum#15, sum#16]
Arguments: hashpartitioning(w_warehouse_name#7, i_item_id#9, 5), ENSURE_REQUIREMENTS, CometColumnarShuffle, [plan_id=1]

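CometColumnarExchange (22) takes the place of Spark's row-based shuffle exchange and only appears when the Comet plugin and its columnar shuffle are active; RowToColumnar (21) and ColumnarToRow (23) are the transitions into and out of that columnar shuffle. The session settings below are a hedged sketch based on the Comet documentation; the exact config keys and the shuffle manager class name should be verified against the Comet release in use.

  // Assumed Comet settings; keys may differ between Comet versions.
  val spark = org.apache.spark.sql.SparkSession.builder()
    .config("spark.plugins", "org.apache.spark.CometPlugin")
    .config("spark.comet.enabled", "true")
    .config("spark.comet.exec.enabled", "true")
    .config("spark.comet.exec.shuffle.enabled", "true")
    .config("spark.shuffle.manager",
      "org.apache.spark.sql.comet.execution.shuffle.CometShuffleManager")
    .getOrCreate()
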
(23) ColumnarToRow [codegen id : 2]
Input [4]: [w_warehouse_name#7, i_item_id#9, sum#15, sum#16]

(24) HashAggregate [codegen id : 2]
Input [4]: [w_warehouse_name#7, i_item_id#9, sum#15, sum#16]
Keys [2]: [w_warehouse_name#7, i_item_id#9]
Functions [2]: [sum(CASE WHEN (d_date#12 < 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END), sum(CASE WHEN (d_date#12 >= 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END)]
Aggregate Attributes [2]: [sum(CASE WHEN (d_date#12 < 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END)#17, sum(CASE WHEN (d_date#12 >= 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END)#18]
Results [4]: [w_warehouse_name#7, i_item_id#9, sum(CASE WHEN (d_date#12 < 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END)#17 AS inv_before#19, sum(CASE WHEN (d_date#12 >= 2000-03-11) THEN inv_quantity_on_hand#3 ELSE 0 END)#18 AS inv_after#20]

(25) Filter [codegen id : 2]
Input [4]: [w_warehouse_name#7, i_item_id#9, inv_before#19, inv_after#20]
Condition : (CASE WHEN (inv_before#19 > 0) THEN ((cast(inv_after#20 as double) / cast(inv_before#19 as double)) >= 0.666667) END AND CASE WHEN (inv_before#19 > 0) THEN ((cast(inv_after#20 as double) / cast(inv_before#19 as double)) <= 1.5) END)

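Filter (25) keeps only groups whose after/before inventory ratio stays within [2/3, 3/2]; because each CASE WHEN has no ELSE branch, groups with inv_before <= 0 evaluate to NULL and are discarded. A minimal sketch of the same predicate in the DataFrame API, assuming a hypothetical DataFrame `aggregated` with the four columns produced by HashAggregate (24) (for example, the output of the groupBy/agg sketch above):

  import org.apache.spark.sql.functions.{col, when}

  val ratio = col("inv_after").cast("double") / col("inv_before").cast("double")
  // when(...) without otherwise(...) yields NULL when inv_before <= 0, so those groups are filtered out.
  val kept = aggregated.filter(
    when(col("inv_before") > 0, ratio >= 0.666667)
      .and(when(col("inv_before") > 0, ratio <= 1.5)))
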
(26) TakeOrderedAndProject
Input [4]: [w_warehouse_name#7, i_item_id#9, inv_before#19, inv_after#20]
Arguments: 100, [w_warehouse_name#7 ASC NULLS FIRST, i_item_id#9 ASC NULLS FIRST], [w_warehouse_name#7, i_item_id#9, inv_before#19, inv_after#20]

===== Subqueries =====

Subquery:1 Hosting operator id = 1 Hosting Expression = inv_date_sk#4 IN dynamicpruning#5
BroadcastExchange (30)
+- * ColumnarToRow (29)
   +- CometFilter (28)
      +- CometScan parquet spark_catalog.default.date_dim (27)


(27) Scan parquet spark_catalog.default.date_dim
Output [2]: [d_date_sk#11, d_date#12]
Batched: true
Location [not included in comparison]/{warehouse_dir}/date_dim]
PushedFilters: [IsNotNull(d_date), GreaterThanOrEqual(d_date,2000-02-10), LessThanOrEqual(d_date,2000-04-10), IsNotNull(d_date_sk)]
ReadSchema: struct<d_date_sk:int,d_date:date>

(28) CometFilter
Input [2]: [d_date_sk#11, d_date#12]
Condition : (((isnotnull(d_date#12) AND (d_date#12 >= 2000-02-10)) AND (d_date#12 <= 2000-04-10)) AND isnotnull(d_date_sk#11))

(29) ColumnarToRow [codegen id : 1]
Input [2]: [d_date_sk#11, d_date#12]

(30) BroadcastExchange
Input [2]: [d_date_sk#11, d_date#12]
Arguments: HashedRelationBroadcastMode(List(cast(input[0, int, false] as bigint)),false), [plan_id=2]

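Subquery:1 is the dynamic partition pruning subquery behind the dynamicpruningexpression on inv_date_sk in scan (1): the filtered date_dim keys are broadcast by (30) so that inventory partitions outside the date range are skipped at scan time. This plan shape assumes dynamic partition pruning is enabled, which it is by default in recent Spark versions; a minimal sketch of the relevant setting, assuming a SparkSession named `spark`:

  spark.conf.set("spark.sql.optimizer.dynamicPartitionPruning.enabled", "true")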

