How do I load text files into database that are greater than the 64 kb buffersize limit?

There is no 64 kb limit in SQL Server. The limits for SQL strings are 8,000 bytes for in-row data types (char, varchar, nchar, nvarchar, binary and varbinary) or 2 GB for the LOB types (varchar(max), nvarchar(max) and varbinary(max)). The 64 kb limitation you're seeing must come from something else, most likely an IIS upload limit or ASP/CLR processing.

But you're not going to be able to process arbitrarily large files like this: .NET will not load a huge stream into a string with Stream.ReadToEnd(), because the memory allocation won't succeed.

You are going to have to load the file in chunks and append each chunk to the stored value, using the LOB-specific UPDATE table SET column.WRITE(...) syntax. P.S. Some responses recommend using the old LOB types text, ntext and image. Don't use those types with SQL Server 2008; they are deprecated and their use is discouraged.
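A minimal sketch of that chunked approach might look like the following. The table (Documents), its Id and varchar(max) Content columns, and the 32 KB chunk size are all illustrative assumptions rather than anything from the original question; the row is assumed to exist already and is seeded with an empty string because .WRITE cannot be applied to a NULL value.

    using System;
    using System.Data.SqlClient;
    using System.IO;
    using System.Text;

    class ChunkedUploader
    {
        // Chunk size is an arbitrary choice that stays well under any request/buffer limits.
        const int ChunkSize = 32 * 1024;

        static void Upload(string connectionString, int documentId, string path)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var reader = new StreamReader(path, Encoding.UTF8))
            {
                connection.Open();

                // Seed the column with an empty string so .WRITE has a non-NULL value to append to.
                using (var seed = new SqlCommand(
                    "UPDATE Documents SET Content = '' WHERE Id = @id", connection))
                {
                    seed.Parameters.AddWithValue("@id", documentId);
                    seed.ExecuteNonQuery();
                }

                var buffer = new char[ChunkSize];
                int read;
                while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // .WRITE(@chunk, NULL, NULL) appends the chunk to the end of the existing value.
                    using (var append = new SqlCommand(
                        "UPDATE Documents SET Content.WRITE(@chunk, NULL, NULL) WHERE Id = @id",
                        connection))
                    {
                        append.Parameters.AddWithValue("@chunk", new string(buffer, 0, read));
                        append.Parameters.AddWithValue("@id", documentId);
                        append.ExecuteNonQuery();
                    }
                }
            }
        }
    }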

I do agree with varchar(max), but I don't see any problem with reading the string. I tested reading and writing the text to the page with files up to 300 kb (I don't expect anything larger) without any problem. As for the suggestion to use UPDATE table ... SET column.WRITE:

Are you suggesting that I "seed" the INSERT with some text and then come back and UPDATE the record using .WRITE? Can you test it on your machine and show me the stored procedure you are suggesting? Also, if you don't have time or can't solve it, can you suggest another forum that might?

Thanks – Michael Sep 29 '09 at 13:12.

Use a TEXT column instead of a VARCHAR/CHAR column. If you need something even bigger than TEXT, or will be loading binary files, look into BINARY/VARBINARY/IMAGE etc.

MSDN provides documentation on all of the available data types. For text files you'll probably want to use the TEXT type, for binary files use one of the Binary String types.

Changed the type to text; still can't read files larger than 64 kb. – Michael Sep 28 '09 at 0:48.

In addition to phoebus's response: if your working buffer is too small (even smaller than 64 kb), you can read the first segment and update the text field with it, then read another buffer, update the field with the existing text plus the new buffer, and repeat until all the data is loaded.
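That read-a-buffer-and-append idea translates to a simple concatenating UPDATE, sketched below. It assumes the column has been changed to (n)varchar(max), since the legacy text type does not support the + operator, and it reuses the illustrative Documents/Id/Content names from the earlier sketch.

    using System.Data.SqlClient;

    // Appends one chunk per call; the caller reads the file buffer by buffer
    // (as in the sketch above) and passes each piece in as chunkText.
    static void AppendChunk(SqlConnection connection, int documentId, string chunkText)
    {
        using (var cmd = new SqlCommand(
            "UPDATE Documents SET Content = Content + @chunk WHERE Id = @id", connection))
        {
            cmd.Parameters.AddWithValue("@chunk", chunkText);
            cmd.Parameters.AddWithValue("@id", documentId);
            cmd.ExecuteNonQuery();
        }
    }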

Changed the type to text; still can't read files larger than 64 kb. How do I change the working buffer size? Can you suggest a change to my code above?

Thanks, – Michael Sep 28 '09 at 0:49.

Databases are not made for storing big files. Store the files on the hard disk instead and store the file names in the database. If you still want to store them in the database anyway, you can use a compression library like #ziplib to decrease the file sizes (source code compresses very well) and use binary column types as phoebus proposes.
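A rough sketch of that file-system approach might look like this; it uses the framework's built-in GZipStream instead of #ziplib just to keep the example self-contained, and the StoredFiles table and its columns are illustrative assumptions.

    using System.Data.SqlClient;
    using System.IO;
    using System.IO.Compression;

    static void StoreOnDisk(SqlConnection connection, string sourcePath, string storageDir)
    {
        // Compress the source file onto disk; only the resulting path goes into the database.
        string storedPath = Path.Combine(storageDir, Path.GetFileName(sourcePath) + ".gz");
        using (var input = File.OpenRead(sourcePath))
        using (var output = File.Create(storedPath))
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            input.CopyTo(gzip); // Stream.CopyTo needs .NET 4; on older versions copy in a loop
        }

        // Record where the file went so the application can find it again later.
        using (var cmd = new SqlCommand(
            "INSERT INTO StoredFiles (OriginalName, StoredPath) VALUES (@name, @path)", connection))
        {
            cmd.Parameters.AddWithValue("@name", Path.GetFileName(sourcePath));
            cmd.Parameters.AddWithValue("@path", storedPath);
            cmd.ExecuteNonQuery();
        }
    }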

OK, I've been spanked; I know this is not best practice. I have greatly simplified the problem for this forum. The application is a rapid-prototyping sales demo tool that combines code snippets and user input to vary the output.

Storing everything in a database (SQL Server 2008 Express) makes it easier to update, move and deploy. I need the data in text form to ease the search-and-replace functionality. Thanks – Michael Sep 28 '09 at 1:16.
