
Laravel S3 uploads failing when uploading large files in Laravel 5.8

Working on a project for a customer using Laravel 5.8, I found that when uploading files to S3 using Laravel’s Flysystem integration, uploads were failing above a certain file size.
The error returned was a validation error saying the file could not be uploaded.

To troubleshoot this, I did a few things:

  1. Ensured my php.ini settings allowed large enough uploads. PHP’s upload_max_filesize defaults to 2M, so I increased it to 20M (raising post_max_size to match), and increased max_execution_time to 300 seconds.
  2. Found that pushing large files to Amazon S3 in one go was unreliable; instead you need to stream them through.
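For reference, the php.ini changes from step 1 look like this (20M and 300 seconds are the values I settled on; adjust them to your needs):

```ini
; Maximum size of a single uploaded file (PHP's default is 2M)
upload_max_filesize = 20M

; Maximum size of the whole POST body; must be >= upload_max_filesize
post_max_size = 20M

; Give slow uploads time to complete (seconds)
max_execution_time = 300
```

Remember to restart PHP-FPM (or your web server) after changing these values.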

Streaming the files

My solution was to upload the file to the server first and, once it is there, stream it from the server to S3.
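A minimal sketch of that approach, assuming the file has already landed on the local disk (the `tmp` path and `$filename` here are illustrative). Passing a stream resource to `put()` lets Flysystem send the file to S3 in chunks rather than loading the whole thing into memory:

```php
<?php

use Illuminate\Support\Facades\Storage;

// Open the locally stored upload as a read stream
$stream = fopen(storage_path('app/tmp/' . $filename), 'r');

// Flysystem accepts a resource and streams it to S3 in chunks,
// so the whole file is never held in memory at once
Storage::disk('s3')->put('customer-uploads/' . $filename, $stream);

// Flysystem may close the stream itself, so check before closing
if (is_resource($stream)) {
    fclose($stream);
}
```

Once the file is safely on S3, you can delete the local copy to free up disk space on the server.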


Using S3 with Laravel 5.8

With any web application, file uploads are a royal pain in the ass.

Storing files on your web server is not the best way to go about things, especially with Laravel.

Ideally, you can have the below setup:

  1. Website code on GitHub
  2. Laravel site hosted using Laravel Forge
  3. Nightly MySQL backups
  4. Files hosted on S3

With this setup, your site is pretty scalable, and if your web host goes down you can easily provision a new server with Laravel Forge and load the site back up, as all of your files live on Amazon S3 rather than on the web server.

The below snippet is a simple way to take a file from a request, validate it against a list of allowed file types, then save it to S3 in a folder called “customer-uploads”. You’ll notice I’m also passing the ‘public’ visibility argument so the file can be accessed publicly; you can remove it if you do not want the file to be public.

This function saves the file to S3, then returns the full URL to the file on Amazon S3, so you can store it in your database and access the file later.

use Illuminate\Support\Facades\Storage;

public function uploadAttachment(Request $request)
{
	// Validate the upload before doing anything with it
	$request->validate([
		'file' => 'required|mimes:pdf,jpeg,jpg,png,gif',
	]);

	$file = $request->file('file');

	// Save to S3 with public visibility; put() returns the stored path
	$path = Storage::disk('s3')->put('customer-uploads', $file, 'public');

	// Return the full S3 URL to the file
	return response()->json(Storage::disk('s3')->url($path));
}

If you want files to be private but still downloadable, say through an admin area, you have to do things a bit differently.

You save the file to S3 as private, then store a reference to it in your database.
In the example below, I create a model called “FileLibrary” with `id` and `url` columns, and store the path of the uploaded file in the database.

use Illuminate\Support\Facades\Storage;

public function uploadAttachment(Request $request)
{
	// Validate the upload before doing anything with it
	$request->validate([
		'file' => 'required|mimes:pdf,jpeg,jpg,png,gif',
	]);

	$file = $request->file('file');

	// Save to S3; files are private by default, and put() returns the stored path
	$path = Storage::disk('s3')->put('customer-uploads', $file);

	// Record the file in the database
	$record = new FileLibrary;
	$record->url = $path;
	$record->save();

	// Return the file’s ID in the database
	return response()->json($record->id);
}

Then you can create a route that points a GET request to the below method.
It requests the file from S3 and streams the download to your browser.
The great thing about this approach is that you can restrict downloads to authenticated users.


use Illuminate\Support\Facades\Storage;

public function downloadFile($id)
{
	$file = FileLibrary::findOrFail($id);

	// The url column holds the S3 path returned by put()
	return Storage::disk('s3')->download($file->url);
}