Unable to copy files larger than 2GB

Using NetDrive version 3.18.1020

Unable to copy files larger than 2GB.
After an attempted copy, the files shown on the GCP bucket all appear with the exact same size in parentheses, like so: (2,097,151) KB.

Robocopy shows this error: “error: retry limit exceeded; the request could not be performed because of an I/O device error”
Local copies and Windows File Explorer show this error: “Error 0x8007045D: the request could not be performed because of an I/O device error.”
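For context, the copy command looks roughly like this (paths and the file name are placeholders; X: is the NetDrive-mounted drive, and /R and /W just set robocopy’s retry count and wait time):

    robocopy C:\Staging X:\Archive bigfile.bin /R:3 /W:5

Plain drag-and-drop in File Explorer fails the same way.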

Hi there, have you been able to determine what the fix for this could be? Thanks in advance.

Dear it_user_f2ed,

I will check this with GCP and give you feedback.

Regards.

Dear it_user_f2ed,

I tested this by creating a drive item in NetDrive connected to Google Cloud Storage, and files over 4GB uploaded without any problems.
My guess is that your Google Cloud Storage has exceeded its quota and request limits. Please try increasing them.

Regards

Can’t find any place to change the quota or limits in GCP. We checked the APIs listed and their quotas but didn’t see anything referring to a 2GB limitation, nor did we see any request limits. We’re using a bucket created there. If I turn off “Use background uploading” and “Use on-the-fly uploading in Explorer” and then check the “Uploading” area within NetDrive, you’ll notice that it oddly shows the size as (x.xxGB / 2.00GB), even though the file we’re trying to copy is 3.84 GB in size (maybe that 2.00GB shown is a clue, hmm). It shows that same error, and then the file browser shows “(2,097,151) KB” beside the copied file after it fails.
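Speaking of clues, that 2,097,151 KB figure looks meaningful on its own: it is just under 2 GiB, the ceiling of a signed 32-bit byte count. A quick sanity check in PowerShell (my own arithmetic, not anything from NetDrive or GCP):

    PS> 2097151KB        # the size shown beside every failed file
    2147482624
    PS> [int]::MaxValue  # the largest value a signed 32-bit integer can hold
    2147483647

So both the (2,097,151) KB listing and the 2.00GB cap in the upload progress look like a file size being clamped to 32 bits somewhere.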

Dear it_user_f2ed,

The URL below is a guide to managing quotas for Google Cloud Storage. Please refer to that guidance page.

Also, could you please attach the log files so that we can better understand the issue?

  1. Please set the log level to VERBOSE
  2. Reproduce the problem.
  3. Afterward, please follow the instructions in the following link to send us the debug log file:

Please be aware that the log file does not contain any login credentials. We apologize for any inconvenience this may cause.

Regards

Hi there,

Interesting information: From the computer where NetDrive is installed…

– If I try to copy files larger than 2 GB (like the 3.8 GB one I’m testing with, using normal drag-and-drop, robocopy, etc.), it fails with the errors I reported at the start of this topic. However, Windows File Explorer shows the partially copied/failed file with its size in parentheses, like so: “(2,097,151) KB”. No matter which large file I send, the same thing happens, with the same KB value in parentheses. Also, as previously mentioned, in that last screenshot you can see that NetDrive’s upload shows the size as (x.xxGB / 2.00GB) while copying.

– Interestingly enough, if I use Google’s gcloud CLI, it successfully sends the full 3.8 GB file over. However, oddly, File Explorer then reports its size as “(163,627) KB”, while the GCP console shows it correctly as 3.8 GB. I’ll send the logs next.
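For reference, the gcloud command was roughly this (bucket name is a placeholder):

    gcloud storage cp .\bigfile.bin gs://our-bucket/

That goes straight to Google’s API instead of through the mounted drive, which is presumably why the full 3.8 GB arrives intact.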


Here are the logs.
nd3svc_Archive-JAX.log.zip (4.4 MB)

Dear it_user_f2ed,

That’s strange. I just connected to Google Cloud Storage using the S3 Compatible storage type and tested uploading a file of about 4GB, and it uploaded without any problems. The screenshots below show the upload progress, the completed upload in Windows Explorer, and the uploaded file in Google Cloud Storage.

My question is, do files smaller than 2GB upload normally?

I will analyze the log file you attached and let you know.

Regards

Yes, interesting screenshots indeed. And yes, files smaller than 2GB upload normally. Thanks again, I look forward to your analysis; I’ll keep trying things too.

I just realized something. Oddly, even though we’re using Google Cloud Storage (in the form of buckets), we’ve got the drives mounting with the “Storage Type” of “S3”. S3 is for AWS, so shouldn’t we be using the “Google Cloud Storage” type instead? Maybe this has been the problem all along.

However, when I try to use that type and click “Connect” I get this error:
Sign in with Google:
Access blocked: Authorization Error
The OAuth client was not found.
Error 401: invalid_client

Note: I also checked and made sure that I could open a browser and be logged in with my GCP credentials before trying this. I bet I need to log in to Bdrive and register each bucket in the Cloud Service Secrets area, after setting things up in the OAuth consent screen area in GCP for each bucket.

Then again, maybe that “Storage Type” choice isn’t the issue, as I see an article of yours saying “You can use your Google Cloud Storage as an S3-compatible storage. If you access your Google Cloud Storage as an S3-compatible storage, it does not require OAuth authentication.” It was here: Use Google Cloud Storage as S3
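If I understand that article correctly, the S3 Compatible type just talks to Google’s S3-interoperability endpoint using an HMAC key pair instead of OAuth. For example (my own illustration; the bucket name is a placeholder, and the HMAC access key and secret come from the Cloud Storage “Interoperability” settings in GCP), the same bucket can be listed with the AWS CLI:

    aws s3 ls s3://our-bucket/ --endpoint-url https://storage.googleapis.com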

Also, I’m curious what OS did you do your test from? We’re using Windows 2022 Server.

Also, interestingly, no matter whether I mount it as a local drive or a network drive, File Explorer is unable to show the correct sizes in its file listing when a file is larger than 2GB. The GCP Console shows the full size correctly.

Dear it_user_f2ed,

I analyzed the log file you attached and didn’t find any clues as to what the problem could be.

You can access Google Cloud Storage from NetDrive in two ways: by selecting either Google Cloud Storage or S3 Compatible as the storage type. In the screenshots above, you can see that the first time I connected using the Google Cloud Storage type (OAuth) and the second time using S3 Compatible (access key and secret), so it doesn’t seem to be a storage type issue.

If you get an Authorization Error when connecting with the Google Cloud Storage storage type, it’s likely due to your permission settings. Please check them.

And I’m testing on Windows 10. Can you also try uploading a file larger than 2GB in NetDrive’s File Browser? If it uploads normally there, it’s likely a problem with Windows File Explorer.

I’ll test this on Windows 2022 Server and let you know.

Regards

Hi there. Two things discovered so far:
a) I get the same errors and issues when testing on a newly made Windows 10 Pro 22H2 VM with the S3 storage type in use.

b) NetDrive’s File Browser works on both Windows Server 2022 and Windows 10! :slight_smile: So I can upload files larger than 2GB and see the correct file sizes.

I’m going to try a few more things, as the goal is to be able to copy files larger than 2GB to it via UNC paths (example below).
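To be clear, by “via UNC paths” I mean copies along these lines, where the server and share names are placeholders for whatever path NetDrive exposes for the mounted drive:

    robocopy C:\Staging \\NETDRIVE-HOST\Archive bigfile.bin /R:3 /W:5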

Hi there, any other suggestions? Thanks

Dear it_user_f2ed,

Sorry for the late response.

If you search for “google cloud storage can’t upload more than 2gb”, this seems to be a fairly common problem.

Could you download and test the following version?
This build uses a changed filesystem. I don’t know if this will solve the problem, but if it doesn’t, it seems to be an issue with Google Cloud Storage.
https://downloads.bdrive.com/netdrive/builds/c63848fea9fb4a56be35d0b0b1d60d47/NetDrive3_Setup-3.17.1019.exe

Regards

Nice! That version (3.17.1019) worked :slight_smile:
– Windows File Explorer now shows file sizes correctly
– And I’m able to copy files larger than 2GB

Questions:

  1. Before this, you had given me version 3.18.1020. Hopefully it didn’t have any fixes in it that the one you just gave me lacks, given its newer version number?

  2. I used this method to upgrade to the newer version; is this okay?
    To upgrade…

  • Dismount any mounted volumes
  • Uninstall previous version
  • Reboot
  • Delete these folders:
    C:\Program Files (x86)\Bdrive
    C:\ProgramData\NetDrive3
  • Then install the latest version
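For what it’s worth, the folder-deletion step was done from an elevated PowerShell prompt, roughly like this:

    Remove-Item -Recurse -Force 'C:\Program Files (x86)\Bdrive'
    Remove-Item -Recurse -Force 'C:\ProgramData\NetDrive3'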

Dear it_user_f2ed,

  1. Both are the latest versions. However, 3.18.1020 and 3.17.1019 use different file systems. 3.18.1020 uses the file system we’re going to adopt going forward, and it looks like it has an issue. Now that I know it’s a file system issue, I’ll fix it and release an update.

  2. Yes, that’s okay.

If you have any issues or questions, don’t hesitate to contact us.

Regards

Fantastic, thank you very much for assisting me through all this. Take care