MS SQL - Import large data using Management Studio only

Good morning!

Just looking for some direction here....

I am supporting a small web app and manage the database as well. I have full access to the web server, but only access to the DB via Management Studio. (Currently, no other access to the SQL box)

I have to update the system with fresh data twice a week. It appears that the only way I can import the data (flat file) into a table on the production server is via a script. It's a HUGE pain. (Generating a script from my local DB, connecting to the remote DB, and running the script, which contains the CREATE TABLE statement and about 15K INSERT statements.)

Just wondering what the best practice is here? Should I contact the DB Admin and ask for some remote access?

Any thoughts? Hoping to make this a quick, simple process.

Thanks!

We have UPDATE DATE columns on all our tables, so when we have data-import processes to another location we only need the rows that are "newer than last time" ("last time" is probably best defined as "the MAX update date on the target database" - then if the target database is restored it will get everything-newer on the next run).
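A minimal T-SQL sketch of that incremental pull, assuming hypothetical table and column names (`dbo.Orders`, `UpdateDate`) and that the source copy of the table is queryable from where this runs:

```sql
-- On the target: find the high-water mark (hypothetical names throughout)
DECLARE @LastSync datetime2 = (SELECT MAX(UpdateDate) FROM dbo.Orders);

-- Pull only the rows changed since then from the source copy of the table
SELECT *
FROM SourceDb.dbo.Orders
WHERE UpdateDate > @LastSync;
```

Because the high-water mark comes from the target itself, a restored target automatically re-requests everything newer than its restore point on the next run.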

You'd also need something that will DELETE any rows that no longer exist in the SOURCE database. One route for that is to export a list of all the PKeys in the source database and delete from the target any row whose PKey is not in that list.
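A sketch of that delete step, assuming the source key list has been staged into a hypothetical table `dbo.SourceKeys` on the target:

```sql
-- Remove target rows whose key no longer exists in the source.
-- Table and column names here are placeholders.
DELETE t
FROM dbo.Orders AS t
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.SourceKeys AS s
    WHERE s.OrderID = t.OrderID
);
```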

Could you have a LINKED SERVER? That would enable you to query the remote DB to get all the data you need (preferably just changed data, but you can run whatever query you like ... big queries will be slow, of course).
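If a linked server is an option, the setup and a query look roughly like this; `REMOTESRV`, `RemoteDb`, and the table are placeholders, and creating the linked server needs elevated permissions (so likely the DBA's help anyway):

```sql
-- One-time setup on the local server (placeholder server name)
EXEC sp_addlinkedserver @server = N'REMOTESRV', @srvproduct = N'SQL Server';

-- Then query the remote DB with a four-part name
SELECT *
FROM REMOTESRV.RemoteDb.dbo.Orders
WHERE UpdateDate > '2024-01-01';
```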

One way to do this is with BCP to bulk-export from the source database, using NATIVE data format. You can then import with BCP on the target server, but you can also do it with SQL BULK commands - provided that the target server can physically see the file you exported (or you can copy it somewhere that the target server can see it).
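A sketch of that round trip, with server, database, and file names as placeholders (the bcp lines run from a command prompt; the BULK INSERT runs in a query window against the target):

```sql
-- Export from the source in native format (command prompt):
--   bcp SourceDb.dbo.Orders out C:\temp\orders.dat -n -T -Ssourceserver

-- Import on the target, either with bcp again (command prompt):
--   bcp TargetDb.dbo.Orders in C:\temp\orders.dat -n -T -Stargetserver

-- ...or with BULK INSERT, if the target server can see the file:
BULK INSERT dbo.Orders
FROM 'C:\temp\orders.dat'
WITH (DATAFILETYPE = 'native');
```

Native format (`-n`) avoids the text-conversion pitfalls of character mode when both ends are SQL Server.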


You should not need any access to the database server's console or RDP in order to do your work. Simply use the right tools on your desktop.

In what format is the data in the file? If it's csv or another well defined format, you can use bcp.exe, import wizard in SSMS or an SSIS package.

I've never done that because I assumed it would be slow ... I have a slow (ADSL) connection from my PC (home working) to any server I talk to remotely ...

... if I was in the same building as the server, would there be much (any?) difference in performance using BCP to import a file, comparing A) everything-on-the-server with B) BCP-and-file on a PC?

I might change my behaviour when I physically visit the office - force-of-habit means I do stuff through Remote Desktop (to the Server) which is a pain compared to just doing stuff locally on my Laptop where I have all my scripts and everything set up "just how I like it" :slight_smile:

Use a VM or some other jump box if you are working from home!

Yes, it's a csv file. I've never used bcp before, any tips? :slight_smile:

Yep, it's a command-line utility that lets you import and export data between SQL Server and formatted files, such as csv.

Here's an example:

bcp.exe Db1.dbo.Tbl1 in C:\temp\SomeFile.txt -c -t, -T -r\r\n -Sserver1
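For reference, here is what each part of that command does (the database, table, file, and server names are just placeholders):

```
bcp.exe Db1.dbo.Tbl1 in C:\temp\SomeFile.txt -c -t, -T -r\r\n -Sserver1

  Db1.dbo.Tbl1   target table (database.schema.table)
  in             direction: import the file into SQL Server
  -c             character (text) data mode
  -t,            field terminator is a comma
  -T             trusted connection (Windows authentication)
  -r\r\n         row terminator is CRLF
  -Sserver1      server to connect to
```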

Thanks Tara. I'm sure I should know what that means in my context, but I don't, I'm afraid.

If I had a VM (at the server end) I would just Remote Desktop into that - is that what you are describing?

Seems to me that anything I Remote Desktop into is like working through a letterbox! Performance is fine, but lots of things are a PITA. All my source code files are local, for example, so I map a Remote Desktop "Resource" to my local drive (and sometimes that drive is on a server locally, not my PC, so that's an added hassle too ...). So I often cut & paste source code into SSMS (running in Remote Desktop at the remote end), run that, and then have to remember to save it back locally afterwards ...

... whereas when I work locally all changes are saved to my local folders, of course, so they get picked up by Version Control (SVN in my case).

When I work on a client's server, updating an SProc [bespoke to the client] I always export the source first and compare it to my local copy - just in case it has been changed remotely and I/someone forgot to "transfer" that version back locally so that it got into SVN. Anything I do locally doesn't have that problem (and for most clients we have security permissions such that I can connect my local SQL Tools direct to their server - so I don't often have "no local connection" Clients).

There must be a better way though! hence I'd appreciate making sure I understand the VM / jump-box possibilities.

Before I joined Brent's organization, we used our laptops as dumb terminals and didn't install much on them. We'd use our laptops to RDP into a desktop or a VM that contained everything we needed. That desktop stayed at work while we worked from home on our laptops. This is how my last 3 jobs were, and it seems to be a pretty standard practice. If instead our laptop was our main machine with everything installed, the IT team would set up a jump host that we could RDP into when working from home, so that we didn't need to RDP to servers.

The only times I would ever RDP into servers was to do installations, reboots, etc. I would run Event Viewer, Performance Monitor, etc. from another machine, since those tools allow you to connect to a remote server. You should never run SSMS on a production server unless it's an emergency. I do install the client tools there, but only for an emergency. Depending upon how Lock Pages in Memory is set up, you could be locking your stuff in memory.


Interesting point Tara. What are the key risks in doing that? I use SSMS once in a blue moon, and consequently that tends to be on the server itself (it's not installed on my desktop - I'm still using Query Analyzer for my day-to-day coding, and my understanding?? is that were I to install SSMS on my desktop it would hose my QA capabilities).

Query Analyzer for your coding? Are you actually using Query Analyzer from SQL Server 2000? What version of SQL Server are you using?

I'd probably setup a virtual machine for your Query Analyzer needs and then have SSMS on your desktop.

Using SSMS on the server takes away valuable resources from production, most importantly memory. If you have Lock Pages in Memory set up for Administrators and you are a local administrator, well, your stuff is now locked in memory too. That's not what you want on a production server.

Some organizations don't even install the client tools on their servers. I do, but only to be used in an absolute emergency.

Thanks Tara. Makes sense that I do that on our servers. I so rarely use SSMS on them that I've been lazy!

Yup, SQL 2012 (I think some servers may be 2014)

(I have looked, extensively, for SQL tools that I like, but my efficiency in QA is still far higher than I have managed to achieve using other tools I have tried. There's a thread here somewhere about the various things I've tried; if you're interested I'll dig out a link for it.)