What is the best practice for decoupling a data mart (SQL Server 2014) that is loaded several times a day from the DWH via ETL? The goal is that end users see no performance impact while a load is running.
One option would be Scalable Shared Database. Are there more elegant solutions, such as replication, cloud, or Stretch Database? There are more than 20 fact tables, and all layers sit on one server.
Scalable Shared Database, if it's even still available, has never been used by anyone I know of, and I've never heard anyone at Microsoft mention it since it was first announced. From what I remember, it was very clunky to set up and use.
Stretch Database is intended to offer archival storage for rarely accessed data. It's not meant to improve performance, and it can be incredibly expensive if you're not careful. Your network connection to the remote storage is unlikely to be faster than any local disk option available.
ETL loads generally use table partitioning to offload I/O and minimize impact on readers. The basic pattern: bulk load into an empty staging table built on the same partition scheme, then switch the newly loaded partition into the fact table, which is a metadata-only operation. Take a look here, especially at the aligned index requirements:
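A minimal sketch of that load-and-switch pattern; all object names (`FactSales`, `pfFactDate`, `psFactDate`, and the staging table) are hypothetical, and the staging table must match the fact table's columns, constraints, and aligned indexes for the switch to succeed:

```sql
-- Illustrative partition function/scheme: one partition per load date.
CREATE PARTITION FUNCTION pfFactDate (date)
    AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-01-02');
CREATE PARTITION SCHEME psFactDate
    AS PARTITION pfFactDate ALL TO ([PRIMARY]);

-- Staging table on the SAME partition scheme as the fact table,
-- with identical columns and an aligned clustered index.
CREATE TABLE dbo.FactSales_Staging (
    SaleDate   date  NOT NULL,
    ProductKey int   NOT NULL,
    Amount     money NOT NULL,
    CONSTRAINT PK_FactSales_Staging
        PRIMARY KEY CLUSTERED (SaleDate, ProductKey)
) ON psFactDate (SaleDate);

-- 1. Bulk load the new day's rows into dbo.FactSales_Staging.
-- 2. Build any remaining aligned nonclustered indexes on staging.
-- 3. Switch the loaded partition into the fact table; this is a
--    metadata-only operation, so end users are barely affected.
ALTER TABLE dbo.FactSales_Staging
    SWITCH PARTITION $PARTITION.pfFactDate('2024-01-02')
    TO dbo.FactSales PARTITION $PARTITION.pfFactDate('2024-01-02');
```

The switch only moves pointers in metadata, which is why the load can run heavy I/O on the staging side while queries against `dbo.FactSales` continue largely undisturbed.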
Another relatively easy and inexpensive performance improvement is solid-state disks, if you're not already using them and don't have 50-100 TB or more of data to load.