Fix performance of a 60 GB database with JSON strings

The JSON data is (as you can see in the table picture) inside the web log.
What you suggest is that the application writes the log data to MongoDB and everything else to SQL.
Again, we would need to combine the info because, from the log, we break down the data (IDs, transaction numbers, etc.) so we can match it to the data in SQL.
So if we do that, we need some link between MongoDB and SQL Server so we can use the data on both sides.
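That link usually comes down to extracting a shared key from each log entry before the entry is split across the two stores. A minimal sketch in plain Python, assuming a hypothetical field name like `transactionId`, and with the two stores simulated as dicts rather than real MongoDB/SQL connections:

```python
import json

# A hypothetical web-log entry; field names are illustrative only.
raw_entry = json.dumps({
    "transactionId": "TX-1001",
    "customerId": 42,
    "requestBody": {"items": [{"sku": "A1", "qty": 2}], "note": "rush order"},
})

entry = json.loads(raw_entry)

# The extracted key that both stores will share.
link_key = entry["transactionId"]

# "SQL side": only the broken-down, queryable fields.
sql_row = {"transactionId": link_key, "customerId": entry["customerId"]}

# "Mongo side": the bulky JSON payload, stored under the same key.
mongo_doc = {"_id": link_key, "payload": entry["requestBody"]}

print(sql_row["transactionId"] == mongo_doc["_id"])  # True: the shared link key
```

The point is only that the key is pulled out once, at write time, so later lookups on either side can use it.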

P.S. I'm not sure if we can get a go-ahead on something like this, since this is a multinational company and I'm not sure about the budgets.

Presumably you need to ask yourself how much of the data in RequestBody you will actually use (1%, 10%, 90%?).
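One way to answer that question is to measure it on a sample: parse some RequestBody values and compare the size of the fields you actually query against the size of the whole blob. A rough sketch, where the structure and the `USED_FIELDS` list are hypothetical:

```python
import json

# A sample RequestBody value; structure and field names are illustrative.
request_body = json.dumps({
    "orderId": "O-77",
    "items": [{"sku": "A1", "qty": 2}, {"sku": "B9", "qty": 1}],
    "clientInfo": {"ua": "Mozilla/5.0", "ip": "203.0.113.9"},
    "debugTrace": "x" * 500,  # bulk you likely never query
})

# Hypothetical: the fields your queries actually touch.
USED_FIELDS = ["orderId", "items"]

doc = json.loads(request_body)
used = {k: doc[k] for k in USED_FIELDS if k in doc}

# Fraction of the blob (by serialized size) that you actually use.
fraction = len(json.dumps(used)) / len(request_body)
print(f"Queried fields are {fraction:.0%} of the blob")
```

Run over a representative sample of rows, that fraction tells you how much of the 60 GB is dead weight as far as queries are concerned.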

One would really need to know a lot more about the system to make recommendations - not something that can be done on a forum. I only floated the idea as a possibility.

Yes, I understand.
Anyhow, it will probably be 30-40% of RequestBody and 5-10% of the other JSON columns.
The system is about 40-50 SQL servers divided across 10 sub-companies. We get the log data from these servers into the APILOG (for the web site, so not all SQL data, just internet sales).
Everything is Microsoft based.
Thanks for the suggestions, btw.

Yeah, MongoDB is a good choice for JSON, but make sure you read the requirements. I'm not sure you can use it as a linked server, but you could create a REST API that marries the data from MongoDB and SQL by joining on a specific key. But this whole idea is major scope creep.
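The "join on a specific key" part of such a REST API could look something like the sketch below, with both back ends mocked as plain lists. In a real service these result sets would come from a MongoDB query and a SQL Server query; the key name `transactionId` and all field names are hypothetical:

```python
# Simulated result sets; in a real service these would come from
# a MongoDB query and a SQL Server query on the same key.
mongo_docs = [
    {"transactionId": "TX-1", "payload": {"note": "rush"}},
    {"transactionId": "TX-2", "payload": {"note": "standard"}},
]
sql_rows = [
    {"transactionId": "TX-1", "customerId": 42, "total": 99.90},
    {"transactionId": "TX-2", "customerId": 7, "total": 12.50},
]

def join_on_key(docs, rows, key="transactionId"):
    """Merge the two result sets on the shared key, as the REST API would."""
    by_key = {d[key]: d for d in docs}
    return [{**row, **by_key.get(row[key], {})} for row in rows]

combined = join_on_key(mongo_docs, sql_rows)
print(combined[0]["customerId"], combined[0]["payload"])  # 42 {'note': 'rush'}
```

That merge is the whole "linked server" substitute: the API is the only place that knows about both stores, which is also why it is fair to call the idea scope creep.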