
Topic on Extension talk:AWS/Flow

Write existing files to S3 using extension

Pyhotshot (talkcontribs)

Has anyone copied existing files from the /images folder to S3? I set up the AWS extension recently, and when I upload an image it is now written to S3 storage. But I have pre-existing images that I want to serve from S3 instead of the images folder.

Also, it seems that MediaWiki is storing images in the /images folder as well as in S3, and I'm afraid the volume will hit its storage limit since everything is kept in both places. How do I avoid storing images on the volume and serve everything from S3?

Samwilson (talkcontribs)

To copy existing files to S3 you can use something like s3cmd or rclone. The extension doesn't have anything built in to do that transfer.
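
For illustration, if you would rather script the one-off copy than use a CLI tool, something along these lines with the AWS SDK for PHP should do the same job; the bucket name, region, images path and autoloader location below are placeholders, and credentials are assumed to come from the usual AWS provider chain (environment variables or an instance role):

<?php
// One-off sketch: copy an existing images/ tree into an S3 bucket, keeping
// the hashed a/ab/File.jpg layout MediaWiki already uses on disk.
// Bucket, region and path are placeholders; adjust them for your wiki.
require_once __DIR__ . '/vendor/autoload.php'; // wherever the AWS SDK autoloader lives

use Aws\S3\S3Client;

$bucket   = 'wonderfulbali234';     // your $wgAWSBucketName
$localDir = '/var/www/html/images'; // your wiki's images/ directory

$s3 = new S3Client( [
    'region'  => 'us-east-1',       // your $wgAWSRegion
    'version' => 'latest',
] );

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator( $localDir, FilesystemIterator::SKIP_DOTS )
);

foreach ( $files as $file ) {
    if ( !$file->isFile() ) {
        continue;
    }
    // Key relative to images/, e.g. "7/78/Something1.jpg"
    $key = ltrim( substr( $file->getPathname(), strlen( $localDir ) ), '/' );
    $s3->putObject( [
        'Bucket'     => $bucket,
        'Key'        => $key,
        'SourceFile' => $file->getPathname(),
    ] );
}

An s3cmd sync or rclone copy does the same thing with less code, so a script like this is mainly worth it if you want to filter or post-process files along the way.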

If it's writing to S3 correctly for new uploads, it shouldn't also be uploading those to the local images directory.

205.174.22.25 (talkcontribs)

Currently it does upload to both: there is a copy of the images on the EBS volume, and they are also uploaded to S3 in three different sizes (not sure if that is expected). What is the setting to use S3 only for the images?

Pyhotshot (talkcontribs)

Hello @Samwilson, you are right, it is not uploading to the local images directory. I got it to upload images to the S3 bucket successfully. But I have a few other unresolved questions/issues.

  1. I was able to move existing images from the images/ directory to S3, but how does MediaWiki know the new address (the S3 bucket) so that it serves those images from S3 instead of the local directory?
  2. The S3 images are not being served at all. What config should I set in LocalSettings.php to let MediaWiki know that images should be served from S3, for both existing and new images?

Samwilson (talkcontribs)

@Pyhotshot: Oh, sorry, I assumed that when you said you'd got it working to upload to S3 that it was also serving correctly from there. So you mean it uploads to the right place, but isn't serving anything? What URL is being produced for the images (i.e. have a look at the img src attribute in the HTML)? That should give you a clue as to which bit is going wrong. What values do you have set for $wgAWSRegion, $wgAWSBucketName, $wgAWSBucketDomain, and $wgImgAuthPath?

Pyhotshot (talkcontribs)

Yes, the images are not being served.

Here are the settings. Apart from these, there are no other settings related to AWS S3.

wfLoadExtension( 'AWS' );
$wgAWSRegion = 'us-east-1'; # Northern Virginia
$wgAWSBucketName = "wonderfulbali234";
$wgArticlePath = "/wiki/$1";
$wgUsePathInfo = true;
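
For reference, a hedged sketch of how the two remaining variables Samwilson asks about might be filled in for a publicly readable bucket; the domain value below follows the common virtual-hosted S3 URL pattern and is an assumption to verify against the extension's documentation, not something confirmed in this thread:

# Assumption: publicly readable bucket served straight from S3; "$1" stands in for the bucket name.
$wgAWSBucketDomain = '$1.s3.amazonaws.com';
# $wgImgAuthPath is only relevant if the bucket is private and files are served through img_auth.php.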

Pyhotshot (talkcontribs)

https://<domainname>/wiki/File:Something1.jpg is how it shows up when I inspect the HTML of the wiki page, and it says "Error creating thumbnail: File missing". The S3 path doesn't have a wiki/ folder in it: <bucketname>/7/78/<somename>.jpg. The bucket prefix is similar to what MediaWiki auto-generated when writing to the images/ folder. Is it expecting a wiki/ folder to be in the bucket as well?
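
Incidentally, the 7/78/ prefix is MediaWiki's ordinary hashed-upload layout rather than something the extension invents, so the bucket matching the old images/ structure is expected. A rough sketch of how such a prefix is derived, with a placeholder filename (the real logic lives in MediaWiki core; this only illustrates the idea):

<?php
// Illustration only: MediaWiki's hashed upload path for a stored file name.
$filename = 'Something1.jpg';              // placeholder, post-normalisation name
$hash = md5( $filename );
$prefix = $hash[0] . '/' . substr( $hash, 0, 2 ) . '/';
echo $prefix . $filename;                  // e.g. "7/78/Something1.jpg" when the hash starts with "78"

The /wiki/ part of the URL, by contrast, comes from $wgArticlePath and applies to page URLs such as File:Something1.jpg, so it would not normally appear in the bucket layout.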
