How to combine data from multiple local SQL Express databases into a single cloud database

Hi
A newbie here, so be gentle!
I am helping a charity which has a number of shops. I have made a SQL Express-based billing system that is working quite well.

What is required is for the sales data from each shop to be sent to a central cloud database every day, so that various analyses can be run on the aggregated data.

Such a system is needed because the Internet is unreliable and not available 24/7, so a cloud-only option is not practical.
There are half a dozen tables.
All the local databases are identical in structure.

One solution I thought of is to have a temp table for each table, put all new transactions in the temp table as well as the normal table, and then find a way of pushing the temp table to the cloud to append the data.
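
In rough terms, something like this is what I had in mind - just a sketch with made-up table, column, and instance names, using a trigger to do the copying, and assuming the SqlServer PowerShell module to run the T-SQL:

```powershell
# Sketch of the staging-table idea; all names here are made up.
# Requires the SqlServer module: Install-Module SqlServer

# A staging copy of the Sales table, holding rows not yet pushed to the cloud
Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" -Database "ShopBilling" -Query @"
CREATE TABLE dbo.Sales_Pending (
    ShopNo   INT           NOT NULL,
    SaleID   INT           NOT NULL,
    SaleDate DATETIME      NOT NULL,
    Amount   DECIMAL(10,2) NOT NULL
);
"@

# A trigger copies every new sale into the staging table automatically,
# instead of doing the double insert by hand
Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" -Database "ShopBilling" -Query @"
CREATE TRIGGER dbo.trg_Sales_Pending ON dbo.Sales AFTER INSERT AS
INSERT INTO dbo.Sales_Pending (ShopNo, SaleID, SaleDate, Amount)
SELECT ShopNo, SaleID, SaleDate, Amount FROM inserted;
"@
```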

I am using VBA and Excel to connect to the local MS SQL Express.

Any and all wisdom appreciated.

Is there a way to identify each shop in the database architecture?

Yes Mike, each table has a Shop No column - an integer - so we can always tell which shop the data belongs to and can analyse by shop.

Using a scripting language, extract all pertinent data to a CSV file and push it to a remote/cloud FTP server as shopnumber.csv. The cloud server will then use those files to populate the cloud database.
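
As a sketch, say in PowerShell with the SqlServer module - instance, database, table, and FTP details below are all made up, so adapt to your setup:

```powershell
# Sketch: export one shop's recent sales to CSV, then push it over FTP.
# Instance, database, table, and FTP details are placeholders.
$shopNo = 7
$csv    = "C:\Export\shop$shopNo.csv"

# Pull the rows added since yesterday for this shop
Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" -Database "ShopBilling" `
    -Query "SELECT ShopNo, SaleID, SaleDate, Amount FROM dbo.Sales
            WHERE ShopNo = $shopNo AND SaleDate >= CAST(GETDATE() - 1 AS DATE)" |
    Select-Object ShopNo, SaleID, SaleDate, Amount |
    Export-Csv -Path $csv -NoTypeInformation

# Push the file to the cloud FTP server (don't hard-code real credentials)
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("ftpuser", "ftppass")
$client.UploadFile("ftp://ftp.example.org/incoming/shop$shopNo.csv", $csv)
```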

A pull mechanism might require you to expose all your shop IPs and ports to the world, and that does not sound sustainable, especially if you spin up more shops (more shops = more configuration).

A push model as described above might be more suitable?

Thanks for the tip yosiasz. A couple of follow-ups.

Can this process be automated so that the guys at the shop don't need to get into the server?
If there is no net connection at the time, does it keep trying till it gets a connection?

I was looking to use Cloud SQL - so I can put the CSV in a bucket and somehow have Cloud SQL load it into the master DB.

I will need to Google and YouTube to tool up to do this.

Yes, it most definitely has to be automated. Just schedule it in Windows Task Scheduler.

Remember not to expose the user names and passwords used to push to the cloud.
Maybe have it run every 15 min.
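
For example - the script path is made up, and this would be run once as administrator:

```powershell
# Sketch: run the push script every 15 minutes via Windows Task Scheduler.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-NoProfile -File "C:\Scripts\PushSales.ps1"'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 15)
Register-ScheduledTask -TaskName "PushSalesToCloud" -Action $action -Trigger $trigger
```

That also answers the retry question: if the net is down on one run, the next run 15 minutes later picks the data up again.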

Or maybe even a backup file? And/or a transaction log file, and you restore to the cloud per store? You most definitely may want to use some sort of zipping of the file, and password-protect it in case a PC is stolen from a store - credit card numbers etc.
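
The built-in Compress-Archive can zip but cannot set a password, so you would need something like 7-Zip. A sketch, assuming 7-Zip at its default install path and made-up file names:

```powershell
# Sketch: compress and password-protect the export before it leaves the shop PC.
# The passphrase and file names are placeholders.
$sevenZip = "C:\Program Files\7-Zip\7z.exe"
& $sevenZip a -tzip "-pStrongPassphraseHere" "C:\Export\shop7.zip" "C:\Export\shop7.csv"
```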

Thanks yosiasz. Last week I tried to set up Cloud SQL. Made the buckets, backed up the DB, put it in the bucket and tried to restore it in the cloud - but kept getting "multiple databases - only one database can be restored". It was only one database that I backed up.

At the outer edges of my skillset!

Hmm, does it allow restore of MS SQL database backups?

What was the backup command you used? You may have backed up the database to an existing file without INIT, which would append the backup to that file, and you would end up with multiple backups in a single file.

Hi Jeff - I used this option.

You should really create an automated, scheduled database backup as part of SQL maintenance, if that exists for SQL Express. This manual approach is not sustainable. What if tomorrow you acquire 100 more stores - will you be doing this manual backup? I don't think that is wise.

Yosiasz - yes, that's the aim. Trying to get there step by step. You are right, doing this manually is not possible even in the short term.

I can see how to query data added since a specific date. This then needs to be sent to the cloud database. There are no conflict issues as each Branch code+key makes a record unique.
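
As a sketch of that step - the watermark file, instance, and column names are made up:

```powershell
# Sketch: export only the rows added since the last successful push.
$markFile = "C:\Export\last_push.txt"
$since    = if (Test-Path $markFile) { Get-Content $markFile } else { "1900-01-01" }
$started  = Get-Date -Format "yyyy-MM-dd HH:mm:ss"

# Branch code + key make each row unique, so appending these rows cannot conflict
Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" -Database "ShopBilling" `
    -Query "SELECT ShopNo, SaleID, SaleDate, Amount FROM dbo.Sales WHERE SaleDate > '$since'" |
    Select-Object ShopNo, SaleID, SaleDate, Amount |
    Export-Csv -Path "C:\Export\shop7_sales.csv" -NoTypeInformation

# Record the new high-water mark only after the export succeeded
Set-Content -Path $markFile -Value $started
```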

The steps after that I need help with understanding.

As a first step I tried to back up the local DB and restore it to Cloud SQL. I am not particularly wedded to Cloud SQL. Any cloud-based solution would do - I just need data from all the stores to arrive in one place.

I need the local DBs as the net connectivity is erratic.

I thought this would be a common requirement, but seemingly not. Most of the resources are about migrating from local to cloud.

In that case maybe DB backup and restore might not be the route. There is also Azure, or Amazon cloud, but the latter could get costly.

Maybe the CSV-to-FTP-then-restore route might be an option, followed by an ETL-like process.

The requirement you have is not uncommon.

Azure - can't back up from MS SQL Server Express. The regular version is unaffordable.
From the above discussion it seems the stages are:

  1. Query the daily new data and save it as CSV - there are 6 tables
  2. When there is a net connection, send that CSV to a Google Cloud bucket
  3. Append that data to the relevant Cloud SQL tables (see the sketch below):
    i. check the bucket for a new file
    ii. run an INSERT query with the CSV
    iii. delete the file in the bucket
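
From what I can find, steps 2 and 3 might look like this with Google's gcloud CLI - bucket, instance, database, and table names are made up, and note that Cloud SQL's CSV import is a MySQL/PostgreSQL feature (a Cloud SQL for SQL Server instance takes .bak imports instead):

```powershell
# Sketch of steps 2-3 with the gcloud CLI (installed and authenticated).
$bucket = "gs://charity-sales-inbox"
$file   = "shop7_sales.csv"

# Step 2: upload the CSV to the bucket
gcloud storage cp "C:\Export\$file" "$bucket/$file"

# Step 3ii: have Cloud SQL import the CSV into the matching table
gcloud sql import csv master-instance "$bucket/$file" --database=masterdb --table=Sales --quiet

# Step 3iii: delete the file from the bucket once it has been imported
gcloud storage rm "$bucket/$file"
```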

First, I need to figure out how to do each of these manually, and then try to automate it.

Is this the right plan?

You also have to think of other DMLs

DELETEs
UPDATEs

Updates are easy, but deletes could be hard depending on how you do them.

Yes. But thankfully we will not have deletes - once a record is in, it should not be deleted. Maybe its status would be changed to Inactive.

The other pages of the backup dialog will have the options - by default those options append data to the file.

In the dialog - there is a script button. Use that to script out the actions to a new query window so you can see exactly what command will be executed.

Based on what I see here, you backed up to an existing file multiple times, which places each backup as a 'new' entry in the file. When restoring from that type of file you have to tell SQL Server which of those entries to use - which Cloud SQL doesn't seem to recognize.
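
For reference, a backup that overwrites the file each time - so the file holds exactly one backup set - would be along these lines; the database name and path are placeholders:

```powershell
# Sketch: WITH INIT, FORMAT overwrites the target file instead of appending,
# so the .bak contains a single backup set.
Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" -Query @"
BACKUP DATABASE [ShopBilling]
TO DISK = N'C:\Backups\ShopBilling.bak'
WITH INIT, FORMAT, NAME = N'ShopBilling full backup';
"@
```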

Since this is Express Edition, you are going to need to automate this outside SQL Server (Express has no SQL Server Agent to schedule jobs). I would recommend that you download dbatools (https://dbatools.io/) - with these tools you can use PowerShell to automate the backups, and then use PowerShell to automate FTP to send the files to your cloud solution.
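
A starting point might look like this - instance, paths, and FTP details are placeholders, not a finished script:

```powershell
# Sketch: back up with dbatools to a fresh timestamped file, then FTP it up.
Import-Module dbatools

$bakFile = "C:\Backups\ShopBilling_$(Get-Date -Format yyyyMMdd_HHmmss).bak"
Backup-DbaDatabase -SqlInstance "localhost\SQLEXPRESS" -Database "ShopBilling" `
    -FilePath $bakFile -Type Full

# Send the backup to the cloud over FTP (keep real credentials out of the script)
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("ftpuser", "ftppass")
$client.UploadFile("ftp://ftp.example.org/backups/$(Split-Path $bakFile -Leaf)", $bakFile)
```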


Thanks for pointing the way Jeff. I will give it a try.

OK, I downloaded it. I am new to both of these, so there is a learning curve! Thanks for getting me started.