Executing very large script on Microsoft SQL Server 2008 R2 fails

I was trying to execute a very large .sql script on Microsoft SQL Server 2008 R2, which runs on Microsoft Windows Server 2008 R2.

First I tried to copy/paste the script into Microsoft SQL Server Management Studio. The server has plenty of memory (about 5 GB of RAM), and the script file is about 150 MB in size with roughly one million lines, so I did not see a problem here. The copy/paste part went fine; however, after executing the script and a long wait, Management Studio surrendered with “Query completed with errors”:
Microsoft SQL Server Management Studio
Cannot execute script.
Insufficient memory to continue the execution of the program. (mscorlib)


Next I tried various combinations of the T-SQL command-line tools:
osql -S iDGZdb-dev -i test2.sql -U myuser
sqlcmd -S IDGZDB-DEV -i test2.sql -U myuser
osql -t 100000 -a 65535 ...

Still no success; various errors appeared in the console. Running locally on the SQL Server machine:
[SQL Server Native Client 10.0]
Shared Memory Provider: No process is on the other end of the pipe.
[SQL Server Native Client 10.0] Communication link failure

Running on remote machine:
[SQL Server Native Client 10.0]
TCP Provider: An existing connection was forcibly closed by the remote host.
[SQL Server Native Client 10.0] Communication link failure

I searched Google and didn’t find any solution. There were many forum threads where users ask for help with this issue, but not a single answer. The suggested “failed network equipment” is nonsense: if I run the script locally, no network should be involved.

Then I looked in the Application event log on the Windows Server where SQL Server resides. Luckily there was an error from SQL Server:
Log Name: Application
Date: 2013.02.04. 17:37:41
Event ID: 7836
Task Category: Server
Level: Error
Keywords: Classic
User: N/A
Computer: iDGZdb-dev
A fatal error occurred while reading the input stream from the network. The maximum number of network packets in one request was exceeded. Try using bulk insert, increasing network packet size, or reducing the size of the request. The session will be terminated.
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
<Provider Name="MSSQLSERVER" />
<EventID Qualifiers="49152">7836</EventID>
<TimeCreated SystemTime="2013-02-04T15:37:41.000000000Z" />
<Security />
<Binary>...some hex stuff...</Binary>

But unfortunately, the error reveals very little about the problem. I didn’t get the part of the suggestion that says “… or reducing the size of the request”. A request of 150 MB is too big for the latest edition of MS SQL? Seriously?

Note: OK, OK… Microsoft SQL Server 2012 is out, and I am not an expert in this area. However, if MySQL and Oracle can handle scripts gigabytes in size, then I must be doing something very wrong — but then again, where are all the suggestions on the net? Even Stack Overflow has only guesses about “network equipment”.

Next, I tried increasing the packet size, as suggested in the event log message. More statements from the SQL script were executed; however, it still stopped with a memory error:
There is insufficient system memory in resource pool 'internal' to run this query.
Network packet size (bytes): 32576

How did I solve this? I surrendered, and instead of wasting more time on weird T-SQL error messages, I wrote a little C# program that executes my file line by line. All of my SQL statements were one-liners. It took me about half an hour to write and run it.

Here is the relevant code fragment:
string line;
SqlConnection connection = new SqlConnection(Properties.Settings.Default.YourConnectionString);
connection.Open();
System.IO.StreamReader file = new System.IO.StreamReader(yourFilename);
while ((line = file.ReadLine()) != null)
{
    try { new SqlCommand(line, connection).ExecuteNonQuery(); }
    catch (Exception ex) { LogErrorSomewhere(ex.Message, line); } // log and continue with the next statement
}
file.Close();
connection.Close();


P.S. If you know how to execute such a script using standard SQL tools, you are welcome to leave me a comment.

13 replies on “Executing very large script on Microsoft SQL Server 2008 R2 fails”

  1. I had a similar problem executing a script file that contained ~100 MB of insert queries. The proper solution using standard SQL tools was sqlcmd.exe.

    Open the window command prompt (cmd) and run the following:

    sqlcmd -S YOURSQLSERVER\INSTANCENAME -i c:\your_script_file.sql -o c:\your_log_file.log

  2. Thanks Maris for creating this blog post and Toni for the sqlcmd command. I used the sqlcmd command and it worked like a charm! You may need to specify the username and password as below.

    sqlcmd -S server -U username -P password -i scriptfile.sql -o logfile.log

  3. Toni and Sinan ILYAS,

    Thanks for your comments. Today I had an opportunity to run the big script again. The file size is about 150 MB, and the commands you provided do not work. A small script works, but when it gets bigger, SQL Server ends with an error:
    HResult 0xE9, Level 16, State 1
    Shared Memory Provider: No process is on the other end of the pipe.
    Sqlcmd: Error: Microsoft SQL Server Native Client 10.0 : Communication link failure.

    This is a fresh Windows and SQL Server install. With default settings it simply does not work. All other SQL tasks are working as expected.


  4. Thank you very much man, I had the same issue with a big script file (250 MB).
    Your solution worked perfectly and saved me a lot of time!!!

  5. I had exactly the same problem. For me, using

    osql -S . -d Test -E -i script.sql

    sort of worked. When I say “sort of”… it worked, but it took a LONG time.

  6. Thanks Maris for creating this blog post and Toni for the sqlcmd command.
    I used sqlcmd and it worked seamlessly on my 150 MB SQL file.

  7. This post actually holds numerous fixes to the issue depending on several factors.

    In my case, sqlcmd with -a 32767 was the key. My file was 1 GB; nothing would open it until I added that -a switch. The reason changing the packet size didn’t work for you is your amount of RAM: when I increased the packet size, I got a 40x increase in memory usage while processing my file. The server went from 7 GB used to 44 GB used, just to process one file. I can only assume that you would have run out of memory even on a 150 MB file with only 5 GB of RAM (40 × 0.150 GB = 6 GB minimum).

    Either way, your code takes the file one line at a time, which is perfect for avoiding memory issues. Thanks for the ideas!

  8. Brad,

    No, and even if I could, it would still require a connection to your DB.
    However, I can suggest you get a free C# compiler from Microsoft and compile the code yourself.

  9. Running my 75 MB SQL file using sqlcmd still throws the same error: “There is insufficient system memory in resource pool ‘internal’ to run this query.” What can be done to handle this?

Comments are closed.