-4 votes
0 answers
32 views

How can I open a 750MB xml file and extract some of the elements? [closed]

I tried to open the file from here (the XEUR one) with MS Access because I want to compare some elements (instruments) with some of mine. The issue is that all tables seem to have thousands of empty rows. ...
asked by megapixel
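One common answer to the question above is streaming parsing, which avoids loading the 750MB document at once. A minimal Python sketch using xml.etree.ElementTree.iterparse; the tag name and file path are placeholders, not taken from the actual XEUR file:

```python
import xml.etree.ElementTree as ET

def extract_elements(path, tag):
    """Stream through a large XML file, yielding the text of matching elements."""
    # iterparse reads the file incrementally instead of building the whole tree
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == tag:
            yield elem.text
        elem.clear()  # drop the element's children to keep memory flat
```

The yielded values could then be compared against the asker's own instrument list without ever holding the full document in memory.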
0 votes
0 answers
84 views

How to verify that two files are equal using Java AWS SDK

I am looking for a way to compare two files (especially large files) in S3 within the same bucket using the Java AWS SDK. I do not need to check the whole bucket for duplicates. As I ...
asked by Bronek Kristal
0 votes
0 answers
53 views

Speed up read access of large (~300mb) samples with H5py

I have a large .h5 file of high resolution images (~300MB each, 200 images per .h5 file) and need to load samples in python. The current setup uses a separate dataset for each sample. data_group....
asked by gekrone (179)
0 votes
0 answers
27 views

In VSCode, how do I quickly refresh large log files after being modified by an external program?

I run software that outputs log files with several hundred thousand lines; the files weigh 100MB+. Every time I re-run the software, the log file opened in VSCode doesn't refresh unless I close ...
asked by المهدي الطالب
0 votes
0 answers
19 views

I'm using an AWS instance (4GB RAM, 2vCPU) and struggling to download a 5–7GB file. The download code seems to block the event loop. Any solutions?

I have added my code below. Let me know if there are any issues. The request comes through a proxy in Next.js. try { const s3Provider = strapi.plugins.upload.provider; const ...
asked by Avnish Kumar
-1 votes
1 answer
81 views

Loading and processing large file (120MB) in memory throws OOM when RAM is still available

I have a task to fix/improve a program that we have which constantly throws OOM exceptions when processing large files (60-90MB) in memory. This program is essentially an in-house solution to move files ...
asked by George (2,213)
-3 votes
2 answers
140 views

How can I optimize a Python script to process large CSV files efficiently?

I'm working on a Python project that involves processing large CSV files (2–5 GB in size). The script reads the CSV file, performs data transformations, and writes the output to a new file. However, ...
asked by Lior Dahan
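For the question above, the usual fix is to stream rows instead of reading the whole CSV into memory. A minimal stdlib sketch; the transformation shown (upper-casing one column) is a placeholder for whatever the asker's script actually does:

```python
import csv

def transform_csv(src, dst, column):
    """Stream src row by row, transforming one column, holding one row at a time."""
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:  # only one row is ever in memory
            row[column] = row[column].upper()
            writer.writerow(row)
```

For heavier transformations, pandas' read_csv with a chunksize argument gives the same streaming behavior with vectorized operations per chunk.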
0 votes
0 answers
45 views

Git LFS Server Error when Uploading RTF File

I want to track a single .rtf file of 55 MB using this command: git lfs track "folderA/folderB/file.rtf" I've also tried adding in a --filename flag. After some time, the server errors out ...
asked by BobDidley (199)
1 vote
0 answers
74 views

PHP fgetcsv on a huge csv file, error SSL: Connection reset by peer

A PHP fgetcsv script updates a database by loading a CSV file of 115,000 rows, but after about 111,000 rows it stops with this error: Warning: fgetcsv(): SSL: Connection reset by peer in "script ...
asked by PicenoComputers
0 votes
0 answers
62 views

Keep getting "Gcloud Pub/Sub: Total timeout of API google.pubsub.v1.Publisher exceeded 60000 milliseconds"

I am new to GCP and I've developed a simple Google Cloud Run service that reads a CSV file from a Google Cloud Storage bucket and pushes each row to Pub/Sub. The code functions well on the emulator ...
asked by SAUH (11)
0 votes
0 answers
368 views

Various issues with uploading large files to S3 via Strapi

I'm running an installation of Strapi v5.1.1 on an Ubuntu v24.10 server with Nginx 1.26. The Strapi app uses an S3 bucket for media file storage, and a MySQL database, hosted on a DigitalOcean Droplet....
asked by slownames (171)
1 vote
0 answers
111 views

write syscall with mmaped buffer

I work with large files, and in the course of development I had to copy large files (about 10 gigabytes) in C code. This raised the question: how effective is the combination of mmap()+write() (...
asked by complex (29)
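The question above is about C, but the mmap()+write() combination can be sketched in Python (the listing's most common language) with the stdlib mmap module: the source is mapped, and slices of the mapping are handed to write, so pages are faulted in on demand rather than read up front. The chunk size is an arbitrary choice, not from the question:

```python
import mmap

def mmap_copy(src, dst, chunk=64 * 1024 * 1024):
    """Copy src to dst by mmapping the source and writing it out in slices."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        size = fin.seek(0, 2)  # seek to end to learn the file length
        if size == 0:
            return  # mmap cannot map an empty file
        with mmap.mmap(fin.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            for off in range(0, size, chunk):
                fout.write(mm[off:off + chunk])  # each slice pages in on demand
```

Whether this beats a plain read/write loop depends on page-cache behavior; for a one-shot sequential copy the kernel's readahead usually makes the two approaches comparable.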
0 votes
1 answer
53 views

Ordering rows based on multiple conditions in R

I am working with a large dataset of over 1.3 million fish detection points. I am trying to calculate distances and swim speed based on the location and time stamp data. However, I noticed that random ...
asked by anzac21 (13)
0 votes
0 answers
50 views

What is the ideal way to save a large and ever-growing encrypted logseq folder with version control?

I am using logseq together with cryptomator. I have a large graph now that I would very much not like to lose. Up to this point I have used git in the encrypted folder. The main issue right now is ...
asked by xoux (3,504)
0 votes
0 answers
42 views

django StreamingHttpResponse to return large files

I need to get pdf files from s3 and return the same file to the frontend. def stream_pdf_from_s3(request, file_key): s3_client = boto3.client('s3') try: response = s3_client....
asked by Azima (4,161)
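For the question above, StreamingHttpResponse wants an iterator, so the usual pattern is a generator that yields the S3 body in fixed-size chunks instead of reading it whole. A framework-agnostic sketch of that generator (boto3's response body also exposes an equivalent iter_chunks method):

```python
def iter_file_chunks(fileobj, chunk_size=8192):
    """Yield a file-like object's contents in fixed-size chunks."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk
```

In a Django view this would look roughly like StreamingHttpResponse(iter_file_chunks(response["Body"]), content_type="application/pdf"), assuming response is the boto3 get_object result.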
3 votes
2 answers
110 views

Quickest way to iterate over a pandas dataframe and perform an operation on a column, when what the operation is depends on the row

I have a table that is laid out somewhat like this: t linenum many other columns 1234567 0 ... 1234568 0 ... 1234569 0 ... 1234570 1 ... 1234571 1 ... Except it is very, very large. As in, the raw ....
asked by Réka (145)
0 votes
1 answer
57 views

Different Starting Indices for Iterating over Large Files

I want to begin by saying I have NOT programmed in many, many years, so sorry if this is a somewhat trivial question; my interest has been mathematics for the last couple of years. Here's my code: ...
asked by wyboo (783)
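One common way to start iterating a large file from different positions, which the question above seems to ask about, is to seek to a byte offset and realign to the next full line rather than re-reading the prefix. A minimal sketch under that assumption:

```python
def lines_from_offset(path, offset):
    """Yield lines starting from the first full line at or after a byte offset."""
    with open(path, "rb") as f:
        f.seek(offset)
        if offset:
            f.readline()  # discard the (possibly partial) line we landed in
        for line in f:
            yield line.decode()
```

Seeking is O(1) regardless of file size, which is what makes this practical for large files; iterating from the start just to skip lines is O(offset).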
0 votes
1 answer
115 views

Read Large XML file from S3 Bucket and validate using XSD

I am creating a Lambda to validate an XML message against an XSD. I have all the python code working correctly on AWS, however as the file can be very large (around 2GB), I needed to find a way to ...
asked by codemonkey
-1 votes
1 answer
260 views

Stream Large CSV File Generated in SpringBoot API to browser throwing broken pipe exception

I'm working on an endpoint that queries a datasource, generates a large CSV file, and streams it to the browser. I'm having an issue when the stream has sent ~2GB: this error is ...
asked by cobolstinks (7,161)
0 votes
1 answer
70 views

Minizip not working to compress file that has size > 4 GB

My program uses minizip (zlib version 1.2.3) to compress an XML file. When the XML file size is greater than 4 GB (in fact the max value of a 32-bit unsigned int, 4294967295), I find that when I try to extract ...
asked by user26530257
0 votes
0 answers
8 views

Having problems pushing large files in git [duplicate]

An error comes up every time I push my file to GitHub. I tried changing the HTTPS version from 2 to 1.1, I have also tried to track files using git lfs, and I have also added these files to the ...
asked by Vedica Gairola
0 votes
1 answer
61 views

VBA read 1st line of large utf8 csv

I tried to use the ADODB stream to read only the first line of a large UTF-8 file. For my purpose I need only the values of the first line of a CSV (holding the headers of a table). Public Sub stream() ...
asked by NaHolla (103)
0 votes
2 answers
194 views

How to parse multiple Large XML files with good performance and memory usage balance?

I'm implementing a program which should get files from the user in Angular and send them to the Node.js backend, where they will be read and parsed into an array of objects. I went with JavaScript ...
asked by Andrea Fantini
0 votes
0 answers
49 views

HTTP 413 error uploading large file on apache

Using the django-filer admin file manager to upload a 2GB file. On the same Windows machine it works OK in the VSCode development environment, but returns HTTP 413 on an Apache SSL-enabled HTTP server. And ...
asked by Jian Zheng
1 vote
0 answers
41 views

In PHP, a 1GB file takes one hour to download over an FTP connection. How can I increase the download speed?

Over a PHP FTP connection, I am trying to download one large file (at least 1GB) from a client server to my server, but the download takes at least one hour. But the server configuration and bandwidth ...
asked by Raja N (21)
0 votes
0 answers
92 views

Uploading Large WAV Sound Files into Github

I'm trying to upload large WAV files into a GitHub repository that I have. They have music that will play in a game I'm designing, but GitHub won't let me upload them to the repository because they ...
asked by Sean Creveling
0 votes
0 answers
215 views

Uploading large files (>10MB) to S3 through API Gateway

I have a Lambda function which uploads Excel files to S3. Since there is a 10MB limit for API Gateway, how can I upload larger files? Is there a way other than creating a Lambda function which ...
asked by Mohan (1)
2 votes
0 answers
74 views

Explain the benefit of streaming file data when saving to a database in Microsoft example code?

I am looking for the best solution to upload large files to a MySQL database via an ASP.NET Core web app using Entity Framework Core. I am looking at the Microsoft documentation here, but I am ...
asked by RyanO (51)
0 votes
0 answers
51 views

Creating a precomputed cloudvolume from an HDF5 file

I have 3620 images in .tif format, totalling 1TB. Here's their info: Shape: (21796, 12876) dtype: uint16 I'm looking to use Neuroglancer to visualize this dataset. To achieve this, I have converted ...
asked by MagicLudo
0 votes
1 answer
322 views

How to stream large .zip files from Azure Blob Storage directly to the browser?

Requirement Support bulk download of files stored on Azure blob storage for the users either via .NET Core endpoint or through SAS URL from Azure Specifications Content Type? - application/zip How ...
asked by Nirav Soni
0 votes
0 answers
35 views

How to use the Dropbox API to upload a LARGE backup

I'm writing a script to copy a large website (29GB compressed) and I cannot seem to get it to upload via the Dropbox API. curl -T 05-07-2024_11-01PM_files_backup.tar.gz -X POST https://content....
asked by Stuart Gray
0 votes
0 answers
35 views

VSCode not opening large file

I need to open a file in VSCode; the problem is that the file is 22.5GB and I am getting the OOM error. It keeps loading until it reaches 15GB RAM usage and then crashes. I have over 35GB free ...
asked by JankoV (1)
0 votes
0 answers
62 views

Delete all the columns from a very large file which satisfy a specific condition

I have a very large file of size approx. 5-10 GB. The file will have 2 types of lines: Lines starting with "#define CHAR_DEF" Lines starting with "% " Format of "% " ...
asked by Ravi Patel
1 vote
0 answers
58 views

Streaming large file through several services with no storing in memory

I have service "A" with controller method: @GetMapping("/file") public ResponseEntity<StreamingResponseBody> getPdf() throws IOException { File file = ...
asked by Николай
2 votes
1 answer
102 views

how to filter huge csv file by pandas

I have a 10GB CSV file data/history_{date_to_be_searched}.csv. It has more than 27000 zip codes. On the basis of zip code I have to filter the CSV file, then each filtered file I have to upload to ...
asked by zircon (930)
2 votes
0 answers
44 views

Creative way to work with large model (potentially using excel macros/vba)

I created an 80.7MB Excel file. The basic structure of the file is: the user enters inputs on one sheet, formulas run on a different (hidden) sheet, and outputs are displayed on a third sheet. My ...
asked by Alek Kevorkian
0 votes
0 answers
83 views

Issues with processing large Excel files from SFTP using Roo gem in Rails application

I'm encountering difficulties processing large Excel files fetched from an SFTP server in my Rails application. The application comprises a Rails server and a Karafka server, and the application ...
asked by user23353243
0 votes
0 answers
92 views

R: efficient and fast splitting large data files in a directory by a variable and write out the files

I have run into a problem regarding how to quickly and efficiently read and split a list of very large transaction data files by a column called SecurityID; inside each transaction data file there can be ...
asked by ML33M (415)
2 votes
1 answer
696 views

How do I open a file larger than available memory in Visual Studio?

In my pre-Windows days, I recall using a hex editor that only loaded the part of the file I was working on. I need to load a file that is larger than available memory into Visual Studio to search ...
asked by Barny (151)
0 votes
1 answer
105 views

Edit the final part of a large (1.5GB) text file in NodeJS

My tool appends little JSON blocks, each with a comma at the end, into a txt file which initially has [ as the first character, in order to create a whole JSON-format text file, like below: {data: text1}, ...
asked by blueway (139)
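For the question above, the final part of a large file can be edited in place by seeking from the end, truncating, and rewriting only the tail, without reading the 1.5GB prefix. The question is about NodeJS (where fs.truncate plus an append serves the same role); a Python sketch of the idea, assuming the file ends with a trailing "," or ",\n" as the question describes:

```python
import os

def close_json_array(path, trailer=b"]"):
    """Replace a trailing comma at the end of the file with a closing bracket,
    touching only the last few bytes."""
    with open(path, "r+b") as f:
        f.seek(-2, os.SEEK_END)  # inspect just the last two bytes
        tail = f.read(2)
        # strip a trailing ",\n" or "," left by the last appended block
        cut = 2 if tail == b",\n" else (1 if tail.endswith(b",") else 0)
        f.seek(-cut, os.SEEK_END)
        f.truncate()
        f.write(trailer)
```

Files shorter than two bytes would need a guard before the negative seek; that edge case is omitted here for brevity.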
1 vote
1 answer
284 views

Python error: "ValueError: cannot switch from manual field specification to automatic field numbering" and I don't know why

I have been working on making a multiplication table using a function, a for loop, and string formatting. I tried a small-scale test table going up to twenty, which printed fine. Then I went all the ...
asked by mechricson
0 votes
0 answers
130 views

c# Dividing into two images / stream to bitmap

I use Visual Studio 2022, C# language (newbie...). I'd like to split one image into two (divide it in half). The first of the split images is input to the first picturebox and then the other second ...
asked by RANG MIHO
0 votes
0 answers
41 views

Waking large archived files to go OFF tape by trying to open them for 1 second

I have access to a tape archival system on my network. The files stored on this system are quite large, about 700-800MB each, and accessing the system from Windows shows them marked "Archive." ...
asked by AKS (89)
1 vote
1 answer
73 views

Can ProcessPoolExecutor work with yield generator in Python?

I have a Python script aiming to process a large file and write the results to a new txt file. I simplified it as Code example 1. Code example 1: from concurrent.futures import ...
asked by Yihang Zhu
1 vote
0 answers
216 views

How to download a 30GB file through JavaScript

I have a URL for a 30GB file, and I want to create a JavaScript function which will download the file to the user's system chunk by chunk (1GB each). In simple terms, something like: 1GB of data ...
asked by deepak vajpayee
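Chunked downloads like the one asked about above are usually built on HTTP Range requests (the server must answer 206 Partial Content). The per-chunk header values can be computed ahead of time; a small sketch in Python (the listing's most common language) of that computation, which translates directly to a JavaScript fetch loop:

```python
def byte_ranges(total_size, chunk_size):
    """Return the Range header values covering a file of total_size bytes."""
    # Range is inclusive on both ends, hence the -1 on the upper bound
    return [
        "bytes={}-{}".format(start, min(start + chunk_size, total_size) - 1)
        for start in range(0, total_size, chunk_size)
    ]
```

Each value is sent as a `Range` request header; total_size itself can be learned from a HEAD request's Content-Length.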
0 votes
1 answer
183 views

How can I disable large size warnings in JupyterLab?

Example of a large size warning in JupyterLab: I don't want to disable all warnings, only large size warnings.
asked by Franck Dernoncourt
1 vote
1 answer
488 views

How to fix "Cannot allocate vector of size..." when using filter-function? [closed]

In a university class, I need to work with a pretty big longitudinal data set: the .rds file is around 300MB, in total 380,000 observations of 5,160 variables. The data set goes back to 1984; however, I ...
asked by Moritary
0 votes
1 answer
67 views

Large .pkl data for backend is not pushed to GitHub

I am learning ML. Recently, I made a movie recommendation model from the TMDB dataset. I processed data using that model into a .pkl (binary) file and made a backend using that data, but the data is too large to push ...
asked by DEV MULIYA
-4 votes
2 answers
109 views

apply sed only to the part of the file after last match in loop - shell / bash [closed]

I have a couple of large files (~1Gb) of such structure: fooA iug9wa fooA lauie fooA nwgoieb fooB wilgb fooB rqgebepu fooB ifbqeiu ... fooN ibfiygb fooN yvsiy fooN aeviu I would like to replace in ...
asked by MartynaM
2 votes
1 answer
271 views

create php code for large files openssl encrypt using AES-256-CTR compatible with openssl command line

I am trying to create a PHP method to replicate the OpenSSL command-line function, so I can encrypt using PHP and then decrypt using the command line. I created a PHP method for encrypting files taking ...
asked by Bilal H (33)
