Streaming Instead of Making Users Wait: A tryst in Golang

SHIVAM SOURAV JHA

When developing apps that handle file uploads, particularly larger files such as XLSX documents, the time it takes to upload, process, and respond can significantly affect the user experience.

Traditionally, the approach is to wait for the entire procedure to complete before responding. This can give users the impression that the application is slow or stuck, especially when processing takes a long time.

Interestingly, in a recent project I had a similar tryst with XLSX uploads. I would parse the Excel sheet, look the data up in the database, and send it back. So I faced the same challenge: what to do? Let users wait, or send them everything at once?

I chose speed over accumulation, and perhaps opened the gates to new challenges. You can check out the backend implementation of this project on my GitHub repository and explore the UI live at Stocksight.

In this blog, I’ll show you how I used streaming to improve the user experience by delivering partial responses as the files are processed, so the user doesn’t have to wait for everything to be combined and delivered all at once.

The Traditional Approach

Typically, when processing file uploads, you parse the entire form, upload the files, process them, and then deliver a final response. While this works well for small files, it creates substantial delays with bigger ones. Users stare at a spinner waiting for something to happen; if there is a delay, they have no idea what’s going on.

Imagine having to scan 10 XLSX files and fetch data for every stock in them (30 × 10 = 300 lookups); in the worst case, that alone adds a delay of 5–10 seconds.

Furthermore, the bulkier the response, the more likely it is to break midway due to network issues.

This issue can be addressed with streaming, which lets the server process data incrementally and send responses in real time. Here’s how I applied this concept in a Gin-based application.
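Before getting to the project code, here is the core primitive in isolation. This is a minimal, self-contained sketch (not taken from the project) of a Gin handler that writes a chunk, flushes it, and repeats, so the client starts receiving output while the server is still working:

package main

import (
    "fmt"
    "time"

    "github.com/gin-gonic/gin"
)

func main() {
    r := gin.Default()

    // A toy endpoint that streams five chunks, one per second.
    r.GET("/stream", func(ctx *gin.Context) {
        ctx.Writer.Header().Set("Content-Type", "text/plain")
        ctx.Writer.Header().Set("Cache-Control", "no-cache")

        for i := 1; i <= 5; i++ {
            fmt.Fprintf(ctx.Writer, "chunk %d\n", i)
            ctx.Writer.Flush()     // push this chunk to the client right away
            time.Sleep(time.Second) // stand-in for real work
        }
    })

    r.Run(":8080")
}

The Flush call after each write is what makes this a streaming handler; without it, the output may sit in the response buffer and still arrive all at once when the handler returns.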

The Challenge

The main issue was that users who uploaded many XLSX files had to wait until the full procedure (parsing, saving, and processing) was completed before receiving a response. This meant long wait times, particularly for large uploads.

How did I come upon the solution?

The idea for streaming came from knowing the distinctions between HTTP/1.1 and HTTP/2. Under HTTP/1.1, every file upload would need its own connection, causing inefficiencies and delays. HTTP/2, however, introduces multiplexing, which allows many data streams to share a single connection. This gave me the idea to use a similar streaming method to optimise the file upload process, giving real-time feedback while files are processed.

The Streaming Solution

I employed a streaming technique that delivers data to the user in chunks rather than waiting for the full upload to finish processing. This way, the application stays responsive and the user can see that progress is being made.

Let’s break down the solution.

func (f *fileController) ParseXLSXFile(ctx *gin.Context) {
    defer sentry.Recover()
    transaction := sentry.TransactionFromContext(ctx)
    if transaction != nil {
        transaction.Name = "ParseXLSXFile"
    }

    span := sentry.StartSpan(context.TODO(), "ParseXLSXFile")
    defer span.Finish()

    // Parse the multipart form and retrieve the uploaded files
    form, err := ctx.MultipartForm()
    if err != nil {
        span.Status = sentry.SpanStatusFailedPrecondition
        sentry.CaptureException(err)
        ctx.JSON(400, gin.H{"error": "Error parsing form data"})
        return
    }

    // Retrieve the files from the form
    files := form.File["files"]
    if len(files) == 0 {
        ctx.JSON(400, gin.H{"error": "No files found"})
        return
    }

    uploadDir := "./uploads"
    if err := os.MkdirAll(uploadDir, os.ModePerm); err != nil {
        span.Status = sentry.SpanStatusFailedPrecondition
        sentry.CaptureException(err)
        ctx.JSON(500, gin.H{"error": "Error creating upload directory"})
        return
    }

    // Buffered channel carrying saved file paths to the parsing service
    savedFilePaths := make(chan string, len(files))

    // Save each uploaded file on the server and push its path onto the channel
    for _, file := range files {
        src, err := file.Open()
        if err != nil {
            span.Status = sentry.SpanStatusFailedPrecondition
            sentry.CaptureException(err)
            ctx.JSON(500, gin.H{"error": "Error opening file"})
            return
        }

        filename := filepath.Base(file.Filename)
        savePath := filepath.Join(uploadDir, filename)

        dst, err := os.Create(savePath)
        if err != nil {
            src.Close()
            span.Status = sentry.SpanStatusFailedPrecondition
            sentry.CaptureException(err)
            ctx.JSON(500, gin.H{"error": "Error creating file on server"})
            return
        }

        _, err = io.Copy(dst, src)
        // Close both files now instead of deferring, so handles
        // don't accumulate while the loop is still running.
        src.Close()
        dst.Close()
        if err != nil {
            span.Status = sentry.SpanStatusFailedPrecondition
            sentry.CaptureException(err)
            ctx.JSON(500, gin.H{"error": "Error saving file"})
            return
        }

        savedFilePaths <- savePath
    }
    close(savedFilePaths)

    // Set headers for a chunked (streamed) response
    ctx.Writer.Header().Set("Content-Type", "text/plain")
    ctx.Writer.Header().Set("Cache-Control", "no-cache")
    ctx.Writer.Header().Set("Connection", "keep-alive")

    // Process the files and stream partial results back to the client
    err = services.FileService.ParseXLSXFile(ctx, savedFilePaths)
    if err != nil {
        span.Status = sentry.SpanStatusFailedPrecondition
        sentry.CaptureException(err)
        ctx.JSON(500, gin.H{"error": err.Error()})
        return
    }

    span.Status = sentry.SpanStatusOK
    ctx.Writer.Write([]byte("\nStream complete.\n"))
    ctx.Writer.Flush() // Ensure the final chunk is sent
}

Key Features of the Streaming Approach:

  1. Real-Time File Handling: each file is handled as it is received, rather than waiting for the whole batch before replying; results for each file are sent to the user as soon as they are ready.
  2. Channel-Based File Streaming: each file’s saved path is pushed onto a channel (savedFilePaths), which feeds the file parsing service as files become available (see the sketch after this list).
  3. Chunked Transfer Encoding: by setting the headers (Content-Type, Cache-Control, Connection), the server signals that the response will be delivered in chunks. This keeps the connection open while the data is being handled and makes streaming possible.
  4. Concurrent Parsing: the channel allows files to be saved and parsed concurrently, enabling quicker processing and feedback.
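The parsing service itself lives in the repository; to show the pattern the list above describes, here is a hedged, minimal sketch of what the channel-consuming side could look like. The fileService type and processFile helper are hypothetical names used only for illustration; in the real project, this is where the sheets are parsed and the database is queried:

package services

import (
    "fmt"
    "path/filepath"

    "github.com/gin-gonic/gin"
)

// fileService and processFile are stand-ins for this sketch only.
type fileService struct{}

func processFile(path string) (string, error) {
    // The real service would parse the XLSX here and look up the stocks;
    // this stub just lets the sketch compile.
    return "processed", nil
}

func (s *fileService) ParseXLSXFile(ctx *gin.Context, savedFilePaths <-chan string) error {
    // Range over the channel: each saved file is handled as soon as its path arrives.
    for path := range savedFilePaths {
        result, err := processFile(path)
        if err != nil {
            return err
        }

        // Write this file's result and flush immediately so the client
        // receives it without waiting for the remaining files.
        if _, err := fmt.Fprintf(ctx.Writer, "%s: %s\n", filepath.Base(path), result); err != nil {
            return err
        }
        ctx.Writer.Flush()
    }
    return nil
}

The important part is the Flush after each write: every file’s result leaves the server as soon as it is ready instead of accumulating in one final response.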

Final Benefits of Streaming

  1. Decreased Waiting Time: users no longer have to wait for the entire upload to finish processing; they receive real-time feedback as each file is processed (the client sketch after this list shows what that looks like).
  2. Better User Experience: instead of waiting for a full response, users can see that the server is working and making progress.
  3. Scalability: with this method, the server can handle large or numerous file uploads without holding the entire result in memory before responding.
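To make the real-time feedback concrete, here is a small, hypothetical Go client (not part of the project; the URL and request shape are assumptions) that consumes such a stream. Each line is printed the moment the server flushes it, long before the full response is complete:

package main

import (
    "bufio"
    "fmt"
    "log"
    "net/http"
)

func main() {
    // Hypothetical endpoint; in practice this would be a multipart POST with the XLSX files.
    resp, err := http.Get("http://localhost:8080/stream")
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()

    // Read the body incrementally: each line arrives as the server flushes it.
    scanner := bufio.NewScanner(resp.Body)
    for scanner.Scan() {
        fmt.Println("received:", scanner.Text())
    }
    if err := scanner.Err(); err != nil {
        log.Fatal(err)
    }
}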

Conclusion

I made the process of uploading files interactive and real-time by utilising streaming. This enhanced the application’s perceived performance and made it possible to handle large files or numerous file uploads more effectively. For any server that manages large file transactions, streaming responses rather than making users wait is a potent strategy that improves the user experience overall.

Consider using a streaming strategy where it makes sense if you want to increase the responsiveness of your web apps. It can have a profound impact!
