== Physical Plan ==
* HashAggregate (30)
+- * ColumnarToRow (29)
   +- CometColumnarExchange (28)
      +- RowToColumnar (27)
         +- * HashAggregate (26)
            +- * ColumnarToRow (25)
               +- CometProject (24)
                  +- CometBroadcastHashJoin (23)
                     :- CometProject (18)
                     :  +- CometBroadcastHashJoin (17)
                     :     :- CometProject (12)
                     :     :  +- CometBroadcastHashJoin (11)
                     :     :     :- CometProject (7)
                     :     :     :  +- CometBroadcastHashJoin (6)
                     :     :     :     :- CometFilter (2)
                     :     :     :     :  +- CometScan parquet spark_catalog.default.store_sales (1)
                     :     :     :     +- CometBroadcastExchange (5)
                     :     :     :        +- CometFilter (4)
                     :     :     :           +- CometScan parquet spark_catalog.default.store (3)
                     :     :     +- CometBroadcastExchange (10)
                     :     :        +- CometFilter (9)
                     :     :           +- CometScan parquet spark_catalog.default.customer_demographics (8)
                     :     +- CometBroadcastExchange (16)
                     :        +- CometProject (15)
                     :           +- CometFilter (14)
                     :              +- CometScan parquet spark_catalog.default.customer_address (13)
                     +- CometBroadcastExchange (22)
                        +- CometProject (21)
                           +- CometFilter (20)
                              +- CometScan parquet spark_catalog.default.date_dim (19)
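
This is a Comet-accelerated plan: the scans, filters, projects, joins, and the shuffle run as native columnar Comet operators, while the two HashAggregates (26, 30) remain Spark row-based operators, which is why the ColumnarToRow (25, 29) and RowToColumnar (27) transitions bracket them. A minimal sketch for producing a formatted plan in this layout, assuming a Spark build with the Comet plugin jar on the classpath (the config keys below are Comet's documented settings, not taken from this file):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("comet-plan-demo")
    .config("spark.plugins", "org.apache.spark.CometPlugin")
    .config("spark.comet.enabled", "true")
    .config("spark.comet.exec.enabled", "true")
    .getOrCreate()

  // explain("formatted") prints the operator tree followed by the
  // numbered per-operator detail blocks, as in this file.
  spark.table("store_sales").groupBy().sum("ss_quantity").explain("formatted")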


(1) Scan parquet spark_catalog.default.store_sales
Output [7]: [ss_cdemo_sk#1, ss_addr_sk#2, ss_store_sk#3, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7]
Batched: true
Location: InMemoryFileIndex []
PartitionFilters: [isnotnull(ss_sold_date_sk#7), dynamicpruningexpression(ss_sold_date_sk#7 IN dynamicpruning#8)]
PushedFilters: [IsNotNull(ss_store_sk), IsNotNull(ss_cdemo_sk), IsNotNull(ss_addr_sk), Or(Or(And(GreaterThanOrEqual(ss_sales_price,100.00),LessThanOrEqual(ss_sales_price,150.00)),And(GreaterThanOrEqual(ss_sales_price,50.00),LessThanOrEqual(ss_sales_price,100.00))),And(GreaterThanOrEqual(ss_sales_price,150.00),LessThanOrEqual(ss_sales_price,200.00))), Or(Or(And(GreaterThanOrEqual(ss_net_profit,0.00),LessThanOrEqual(ss_net_profit,2000.00)),And(GreaterThanOrEqual(ss_net_profit,150.00),LessThanOrEqual(ss_net_profit,3000.00))),And(GreaterThanOrEqual(ss_net_profit,50.00),LessThanOrEqual(ss_net_profit,25000.00)))]
ReadSchema: struct<ss_cdemo_sk:int,ss_addr_sk:int,ss_store_sk:int,ss_quantity:int,ss_sales_price:decimal(7,2),ss_net_profit:decimal(7,2)>
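
PushedFilters are handed to the Parquet reader as data-source filters, while PartitionFilters include the dynamicpruningexpression resolved by Subquery:1 below. A hedged reconstruction of the base predicate that yields the pushed OR-of-ANDs (the original query text is not part of this file; the bands are read off the Condition in operator (2)):

  import org.apache.spark.sql.functions.col

  val priceBands =
    col("ss_sales_price").between(100.00, 150.00) ||
    col("ss_sales_price").between(50.00, 100.00) ||
    col("ss_sales_price").between(150.00, 200.00)

  val profitBands =
    col("ss_net_profit").between(0.00, 2000.00) ||
    col("ss_net_profit").between(150.00, 3000.00) ||
    col("ss_net_profit").between(50.00, 25000.00)

  spark.table("store_sales").where(priceBands && profitBands)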

(2) CometFilter
Input [7]: [ss_cdemo_sk#1, ss_addr_sk#2, ss_store_sk#3, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7]
Condition : ((((isnotnull(ss_store_sk#3) AND isnotnull(ss_cdemo_sk#1)) AND isnotnull(ss_addr_sk#2)) AND ((((ss_sales_price#5 >= 100.00) AND (ss_sales_price#5 <= 150.00)) OR ((ss_sales_price#5 >= 50.00) AND (ss_sales_price#5 <= 100.00))) OR ((ss_sales_price#5 >= 150.00) AND (ss_sales_price#5 <= 200.00)))) AND ((((ss_net_profit#6 >= 0.00) AND (ss_net_profit#6 <= 2000.00)) OR ((ss_net_profit#6 >= 150.00) AND (ss_net_profit#6 <= 3000.00))) OR ((ss_net_profit#6 >= 50.00) AND (ss_net_profit#6 <= 25000.00))))

(3) Scan parquet spark_catalog.default.store
Output [1]: [s_store_sk#9]
Batched: true
Location [not included in comparison]/{warehouse_dir}/store]
PushedFilters: [IsNotNull(s_store_sk)]
ReadSchema: struct<s_store_sk:int>

(4) CometFilter
Input [1]: [s_store_sk#9]
Condition : isnotnull(s_store_sk#9)

(5) CometBroadcastExchange
Input [1]: [s_store_sk#9]
Arguments: [s_store_sk#9]

(6) CometBroadcastHashJoin
Left output [7]: [ss_cdemo_sk#1, ss_addr_sk#2, ss_store_sk#3, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7]
Right output [1]: [s_store_sk#9]
Arguments: [ss_store_sk#3], [s_store_sk#9], Inner, BuildRight
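
BuildRight means the right side (store, pruned to a single join-key column) is materialized into a hash table and broadcast to every task, so the large store_sales side is never shuffled. A sketch of requesting the same shape explicitly (Spark also chooses it automatically when the build side fits under spark.sql.autoBroadcastJoinThreshold):

  import org.apache.spark.sql.functions.{broadcast, col}

  // The broadcast() hint forces the dimension table onto the build side.
  val joined = spark.table("store_sales").join(
    broadcast(spark.table("store")),
    col("ss_store_sk") === col("s_store_sk"))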

(7) CometProject
Input [8]: [ss_cdemo_sk#1, ss_addr_sk#2, ss_store_sk#3, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7, s_store_sk#9]
Arguments: [ss_cdemo_sk#1, ss_addr_sk#2, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7], [ss_cdemo_sk#1, ss_addr_sk#2, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7]

(8) Scan parquet spark_catalog.default.customer_demographics
Output [3]: [cd_demo_sk#10, cd_marital_status#11, cd_education_status#12]
Batched: true
Location [not included in comparison]/{warehouse_dir}/customer_demographics]
PushedFilters: [IsNotNull(cd_demo_sk), Or(Or(And(EqualTo(cd_marital_status,M),EqualTo(cd_education_status,4 yr Degree         )),And(EqualTo(cd_marital_status,D),EqualTo(cd_education_status,2 yr Degree         ))),And(EqualTo(cd_marital_status,S),EqualTo(cd_education_status,College             )))]
ReadSchema: struct<cd_demo_sk:int,cd_marital_status:string,cd_education_status:string>

(9) CometFilter
Input [3]: [cd_demo_sk#10, cd_marital_status#11, cd_education_status#12]
Condition : (isnotnull(cd_demo_sk#10) AND ((((cd_marital_status#11 = M) AND (cd_education_status#12 = 4 yr Degree         )) OR ((cd_marital_status#11 = D) AND (cd_education_status#12 = 2 yr Degree         ))) OR ((cd_marital_status#11 = S) AND (cd_education_status#12 = College             ))))

(10) CometBroadcastExchange
Input [3]: [cd_demo_sk#10, cd_marital_status#11, cd_education_status#12]
Arguments: [cd_demo_sk#10, cd_marital_status#11, cd_education_status#12]

(11) CometBroadcastHashJoin
Left output [6]: [ss_cdemo_sk#1, ss_addr_sk#2, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7]
Right output [3]: [cd_demo_sk#10, cd_marital_status#11, cd_education_status#12]
Arguments: [ss_cdemo_sk#1], [cd_demo_sk#10], Inner, ((((((cd_marital_status#11 = M) AND (cd_education_status#12 = 4 yr Degree         )) AND (ss_sales_price#5 >= 100.00)) AND (ss_sales_price#5 <= 150.00)) OR ((((cd_marital_status#11 = D) AND (cd_education_status#12 = 2 yr Degree         )) AND (ss_sales_price#5 >= 50.00)) AND (ss_sales_price#5 <= 100.00))) OR ((((cd_marital_status#11 = S) AND (cd_education_status#12 = College             )) AND (ss_sales_price#5 >= 150.00)) AND (ss_sales_price#5 <= 200.00))), BuildRight
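
Join (11) carries a non-equi residual condition: each marital-status and education pair is tied to its own ss_sales_price band. The standalone price ranges pushed into scan (1) are the union of these bands, inferred by the optimizer so non-qualifying rows are dropped before the join. A hedged reconstruction of the predicate (the values appear space-padded in the plan because the TPC-DS columns are fixed-width CHAR; the original query text is not part of this file):

  val ss = spark.table("store_sales")
  val cd = spark.table("customer_demographics")

  // Each demographic pair gates its own sales-price band.
  val banded = ss.join(cd,
    ss("ss_cdemo_sk") === cd("cd_demo_sk") && (
      (cd("cd_marital_status") === "M" &&
        cd("cd_education_status") === "4 yr Degree" &&
        ss("ss_sales_price").between(100.00, 150.00)) ||
      (cd("cd_marital_status") === "D" &&
        cd("cd_education_status") === "2 yr Degree" &&
        ss("ss_sales_price").between(50.00, 100.00)) ||
      (cd("cd_marital_status") === "S" &&
        cd("cd_education_status") === "College" &&
        ss("ss_sales_price").between(150.00, 200.00))))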

(12) CometProject
Input [9]: [ss_cdemo_sk#1, ss_addr_sk#2, ss_quantity#4, ss_sales_price#5, ss_net_profit#6, ss_sold_date_sk#7, cd_demo_sk#10, cd_marital_status#11, cd_education_status#12]
Arguments: [ss_addr_sk#2, ss_quantity#4, ss_net_profit#6, ss_sold_date_sk#7], [ss_addr_sk#2, ss_quantity#4, ss_net_profit#6, ss_sold_date_sk#7]

(13) Scan parquet spark_catalog.default.customer_address
Output [3]: [ca_address_sk#13, ca_state#14, ca_country#15]
Batched: true
Location [not included in comparison]/{warehouse_dir}/customer_address]
PushedFilters: [IsNotNull(ca_country), EqualTo(ca_country,United States), IsNotNull(ca_address_sk), Or(Or(In(ca_state, [CO,OH,TX]),In(ca_state, [KY,MN,OR])),In(ca_state, [CA,MS,VA]))]
ReadSchema: struct<ca_address_sk:int,ca_state:string,ca_country:string>

(14) CometFilter
Input [3]: [ca_address_sk#13, ca_state#14, ca_country#15]
Condition : (((isnotnull(ca_country#15) AND (ca_country#15 = United States)) AND isnotnull(ca_address_sk#13)) AND ((ca_state#14 IN (CO,OH,TX) OR ca_state#14 IN (OR,MN,KY)) OR ca_state#14 IN (VA,CA,MS)))

(15) CometProject
Input [3]: [ca_address_sk#13, ca_state#14, ca_country#15]
Arguments: [ca_address_sk#13, ca_state#14], [ca_address_sk#13, ca_state#14]

(16) CometBroadcastExchange
Input [2]: [ca_address_sk#13, ca_state#14]
Arguments: [ca_address_sk#13, ca_state#14]

(17) CometBroadcastHashJoin
Left output [4]: [ss_addr_sk#2, ss_quantity#4, ss_net_profit#6, ss_sold_date_sk#7]
Right output [2]: [ca_address_sk#13, ca_state#14]
Arguments: [ss_addr_sk#2], [ca_address_sk#13], Inner, ((((ca_state#14 IN (CO,OH,TX) AND (ss_net_profit#6 >= 0.00)) AND (ss_net_profit#6 <= 2000.00)) OR ((ca_state#14 IN (OR,MN,KY) AND (ss_net_profit#6 >= 150.00)) AND (ss_net_profit#6 <= 3000.00))) OR ((ca_state#14 IN (VA,CA,MS) AND (ss_net_profit#6 >= 50.00)) AND (ss_net_profit#6 <= 25000.00))), BuildRight

(18) CometProject
Input [6]: [ss_addr_sk#2, ss_quantity#4, ss_net_profit#6, ss_sold_date_sk#7, ca_address_sk#13, ca_state#14]
Arguments: [ss_quantity#4, ss_sold_date_sk#7], [ss_quantity#4, ss_sold_date_sk#7]

(19) Scan parquet spark_catalog.default.date_dim
Output [2]: [d_date_sk#16, d_year#17]
Batched: true
Location [not included in comparison]/{warehouse_dir}/date_dim]
PushedFilters: [IsNotNull(d_year), EqualTo(d_year,2001), IsNotNull(d_date_sk)]
ReadSchema: struct<d_date_sk:int,d_year:int>

(20) CometFilter
Input [2]: [d_date_sk#16, d_year#17]
Condition : ((isnotnull(d_year#17) AND (d_year#17 = 2001)) AND isnotnull(d_date_sk#16))

(21) CometProject
Input [2]: [d_date_sk#16, d_year#17]
Arguments: [d_date_sk#16], [d_date_sk#16]

(22) CometBroadcastExchange
Input [1]: [d_date_sk#16]
Arguments: [d_date_sk#16]

(23) CometBroadcastHashJoin
Left output [2]: [ss_quantity#4, ss_sold_date_sk#7]
Right output [1]: [d_date_sk#16]
Arguments: [ss_sold_date_sk#7], [d_date_sk#16], Inner, BuildRight

(24) CometProject
Input [3]: [ss_quantity#4, ss_sold_date_sk#7, d_date_sk#16]
Arguments: [ss_quantity#4], [ss_quantity#4]

(25) ColumnarToRow [codegen id : 1]
Input [1]: [ss_quantity#4]

(26) HashAggregate [codegen id : 1]
Input [1]: [ss_quantity#4]
Keys: []
Functions [1]: [partial_sum(ss_quantity#4)]
Aggregate Attributes [1]: [sum#18]
Results [1]: [sum#19]

(27) RowToColumnar
Input [1]: [sum#19]

(28) CometColumnarExchange
Input [1]: [sum#19]
Arguments: SinglePartition, ENSURE_REQUIREMENTS, CometColumnarShuffle, [plan_id=1]

(29) ColumnarToRow [codegen id : 2]
Input [1]: [sum#19]

(30) HashAggregate [codegen id : 2]
Input [1]: [sum#19]
Keys: []
Functions [1]: [sum(ss_quantity#4)]
Aggregate Attributes [1]: [sum(ss_quantity#4)#20]
Results [1]: [sum(ss_quantity#4)#20 AS sum(ss_quantity)#21]
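
Operators (25)-(30) form a standard two-phase global aggregate: (26) computes a per-partition partial_sum, (28) shuffles the single partial row from each task into one partition, and (30) merges the partials into the final sum. Spark's HashAggregate is row-based while Comet exchanges columnar batches, hence the RowToColumnar (27) and ColumnarToRow (25, 29) transitions around the two phases. Any global aggregate compiles to this shape; a minimal sketch:

  import org.apache.spark.sql.functions.sum

  // partial_sum -> single-partition exchange -> final sum
  spark.table("store_sales").agg(sum("ss_quantity")).explain("formatted")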

===== Subqueries =====

Subquery:1 Hosting operator id = 1 Hosting Expression = ss_sold_date_sk#7 IN dynamicpruning#8
BroadcastExchange (35)
+- * ColumnarToRow (34)
   +- CometProject (33)
      +- CometFilter (32)
         +- CometScan parquet spark_catalog.default.date_dim (31)


(31) Scan parquet spark_catalog.default.date_dim
Output [2]: [d_date_sk#16, d_year#17]
Batched: true
Location [not included in comparison]/{warehouse_dir}/date_dim]
PushedFilters: [IsNotNull(d_year), EqualTo(d_year,2001), IsNotNull(d_date_sk)]
ReadSchema: struct<d_date_sk:int,d_year:int>

(32) CometFilter
Input [2]: [d_date_sk#16, d_year#17]
Condition : ((isnotnull(d_year#17) AND (d_year#17 = 2001)) AND isnotnull(d_date_sk#16))

(33) CometProject
Input [2]: [d_date_sk#16, d_year#17]
Arguments: [d_date_sk#16], [d_date_sk#16]

(34) ColumnarToRow [codegen id : 1]
Input [1]: [d_date_sk#16]

(35) BroadcastExchange
Input [1]: [d_date_sk#16]
Arguments: HashedRelationBroadcastMode(List(cast(input[0, int, true] as bigint)),false), [plan_id=2]
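
This subquery backs the dynamicpruningexpression on scan (1): the d_date_sk values for d_year = 2001 are computed and broadcast before the main query runs, so store_sales partitions outside 2001 are skipped at scan time (dynamic partition pruning). Note that (31)-(33) mirror (19)-(21); only the final BroadcastExchange (35) is specific to the pruning subquery. DPP is on by default in Spark 3.x and is controlled by a single flag:

  // Sketch: toggle dynamic partition pruning for comparison runs.
  spark.conf.set("spark.sql.optimizer.dynamicPartitionPruning.enabled", "true")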
