Hello Community,
The following T-SQL executes with no problems in MS SQL Server:
SELECT MakeName, SUM(Cost) AS TotalCost
FROM Data.Make AS MK INNER JOIN Data.Model AS MD
ON MK.MakeID = MD.MakeID
INNER JOIN Data.Stock AS ST ON ST.ModelID = MD.ModelID
WHERE DateBought BETWEEN
CAST(YEAR(DATEADD(m, -1, GETDATE())) AS CHAR(4))
+ RIGHT('0' + CAST(MONTH(DATEADD(m, -1, GETDATE()))
AS VARCHAR(2)),2) + '01'
AND EOMONTH(DATEADD(m, -1, GETDATE()))
GROUP BY MakeName
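To clarify what that WHERE clause is doing: it concatenates the four-digit year and zero-padded month of last month with '01' to build the first day of the previous month as a YYYYMMDD string, and EOMONTH gives the last day of that month. For illustration (T-SQL, same expressions as above):

```sql
-- Illustration only: if GETDATE() falls in June 2024,
-- FirstOfPrevMonth evaluates to '20240501' and
-- LastOfPrevMonth to 2024-05-31.
SELECT CAST(YEAR(DATEADD(m, -1, GETDATE())) AS CHAR(4))
     + RIGHT('0' + CAST(MONTH(DATEADD(m, -1, GETDATE())) AS VARCHAR(2)), 2)
     + '01' AS FirstOfPrevMonth,
       EOMONTH(DATEADD(m, -1, GETDATE())) AS LastOfPrevMonth;
```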
I have converted the code to work with Spark SQL as follows:
SELECT
Make.MakeName
,SUM(SalesDetails.SalePrice) AS TotalCost
FROM Make
INNER JOIN Model
ON Make.MakeID = Model.MakeID
INNER JOIN Stock
ON Model.ModelID = Stock.ModelID
INNER JOIN SalesDetails
ON Stock.StockCode = SalesDetails.StockID
INNER JOIN Sales
ON SalesDetails.SalesID = Sales.SalesID
WHERE Stock.DateBought BETWEEN cast(year(add_months(current_date(),-1)) as CHAR(4)), add_months(cast(current_date() as CHAR(2)),-1) AND last_day(add_months(current_date(),-1))
GROUP BY MakeName
However, I'm getting the following error:
Error in SQL statement: ParseException:
mismatched input 'FROM' expecting &lt;EOF&gt; (line 4, pos 0)
== SQL ==
SELECT
Make.MakeName
,SUM(SalesDetails.SalePrice) AS TotalCost
FROM Make
^^^
Now, before I get screamed at by someone from this group, I know this question should really be targeted at a Spark SQL forum, but I haven't been able to get any help there.
I was just wondering if someone could take a look at the error and spot where I might be going wrong; you may not need to be an expert in Spark SQL.
Any insights will be appreciated.
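For reference, here is the direction I think the date logic should take in Spark SQL. This is an untested sketch: my guess is that the comma inside the BETWEEN predicate is what breaks the parse, since BETWEEN expects exactly `expr AND expr`. Spark's trunc(date, 'MM') should return the first day of a month and last_day the last day, so no string concatenation is needed (table and column names as in my query above):

```sql
-- Untested sketch: BETWEEN takes two complete date expressions,
-- not a comma-separated list. trunc(date, 'MM') gives the first
-- day of the month; last_day gives the last day.
SELECT Make.MakeName,
       SUM(SalesDetails.SalePrice) AS TotalCost
FROM Make
INNER JOIN Model        ON Make.MakeID = Model.MakeID
INNER JOIN Stock        ON Model.ModelID = Stock.ModelID
INNER JOIN SalesDetails ON Stock.StockCode = SalesDetails.StockID
INNER JOIN Sales        ON SalesDetails.SalesID = Sales.SalesID
WHERE Stock.DateBought
      BETWEEN trunc(add_months(current_date(), -1), 'MM')
          AND last_day(add_months(current_date(), -1))
GROUP BY MakeName;
```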