Hello, I have a problem reading my data files. At the moment I am reading line by line using: Do While objReader.Peek() <> -1. Now I want to read the file in chunks (of, say, 32 KB). The full process of what I want to do is: read the file in chunks, then for each line insert that line into a DataTable. I believe reading the file in chunks is quickest. Can anyone give me an example of how to do this? Thank you.
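A minimal sketch of what you're describing, assuming a UTF-8/ASCII text file: read 32 KB at a time with FileStream.Read, split the buffered text into lines, and add each line to a DataTable. The file name "data.txt" and the single "Line" column are placeholders for your own path and schema.

```vbnet
Imports System.IO
Imports System.Data
Imports System.Text

Module ChunkReader
    Sub Main()
        Dim table As New DataTable()
        table.Columns.Add("Line", GetType(String))

        Dim buffer(32767) As Byte          ' 32 KB chunk
        Dim leftover As String = ""        ' partial line carried between chunks

        Using fs As FileStream = File.OpenRead("data.txt")   ' placeholder path
            Dim bytesRead As Integer = fs.Read(buffer, 0, buffer.Length)
            Do While bytesRead > 0
                ' Note: assumes a single-byte encoding; a multi-byte character
                ' split across chunk boundaries would need a Decoder instead.
                Dim text As String = leftover & Encoding.UTF8.GetString(buffer, 0, bytesRead)
                Dim lines() As String = text.Split(New String() {vbCrLf, vbLf}, StringSplitOptions.None)
                ' The last element may be an incomplete line; keep it for the next chunk.
                For i As Integer = 0 To lines.Length - 2
                    table.Rows.Add(lines(i))
                Next
                leftover = lines(lines.Length - 1)
                bytesRead = fs.Read(buffer, 0, buffer.Length)
            Loop
        End Using

        If leftover.Length > 0 Then table.Rows.Add(leftover)
        Console.WriteLine("Rows loaded: " & table.Rows.Count)
    End Sub
End Module
```

Notice the bookkeeping (the leftover partial line, the encoding caveat) that StreamReader.ReadLine would otherwise handle for you, which is the trade-off the replies below argue about.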
It's not faster at all, because you end up scanning the data character by character in your own VB code. Reading line by line lets the runtime's compiled code scan the stream for you, so it's a lot faster. If you want to see for yourself: open the file as a binary file, read it into a 2 KB byte array, scan it yourself for vbNewLine or vbCr, and write the lines to the database. Do both on a large file (a few MB) and time them, and you'll see the difference.
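A rough timing harness along the lines suggested above, so you can measure rather than guess. "bigfile.txt" is a placeholder; point it at a file a few MB in size. The byte-scan half only counts LF bytes, which is the cheapest possible version of the manual approach.

```vbnet
Imports System.IO
Imports System.Diagnostics

Module TimingTest
    Sub Main()
        ' Approach 1: StreamReader.ReadLine, as in the original question.
        Dim sw As Stopwatch = Stopwatch.StartNew()
        Dim lineCount As Integer = 0
        Using reader As New StreamReader("bigfile.txt")
            Do While reader.Peek() <> -1
                reader.ReadLine()
                lineCount += 1
            Loop
        End Using
        sw.Stop()
        Console.WriteLine("ReadLine:  " & lineCount & " lines in " & sw.ElapsedMilliseconds & " ms")

        ' Approach 2: binary read into a 2 KB buffer, scanning for newlines.
        sw.Restart()
        Dim byteLines As Integer = 0
        Dim buffer(2047) As Byte
        Using fs As FileStream = File.OpenRead("bigfile.txt")
            Dim n As Integer = fs.Read(buffer, 0, buffer.Length)
            Do While n > 0
                For i As Integer = 0 To n - 1
                    If buffer(i) = 10 Then byteLines += 1   ' 10 = LF
                Next
                n = fs.Read(buffer, 0, buffer.Length)
            Loop
        End Using
        sw.Stop()
        Console.WriteLine("Byte scan: " & byteLines & " lines in " & sw.ElapsedMilliseconds & " ms")
    End Sub
End Module
```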
You have no idea what you are talking about. When you read a file line by line, the function reads each character until it hits the newline characters. When you read a file in 32 KB chunks, you open the file, get a stream with a position, and each read advances that position by up to 32768 bytes. And there is a function built in to do it for you: File.OpenRead, and then Read on the resulting FileStream. PS: please stop misleading people.
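For reference, a minimal sketch of the two calls named above. File.OpenRead returns a FileStream, and each Read advances the stream's Position by the number of bytes it returned, so no manual pointer arithmetic is needed. "data.txt" is a placeholder path.

```vbnet
Imports System.IO

Module OpenReadDemo
    Sub Main()
        Dim buffer(32767) As Byte   ' 32 KB = 32768 bytes
        Using fs As FileStream = File.OpenRead("data.txt")
            Dim n As Integer = fs.Read(buffer, 0, buffer.Length)
            Do While n > 0
                ' Read may return fewer bytes than requested, especially
                ' on the final chunk; n tells you how many are valid.
                Console.WriteLine("Read " & n & " bytes, position now " & fs.Position)
                n = fs.Read(buffer, 0, buffer.Length)
            Loop
        End Using
    End Sub
End Module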