Extension talk:AWS

File not found: Although this PHP script (/img_auth.php) exists, the file requested for output (mwstore://AmazonS3/local-public/Pipe_header.jpg) does not.

LuciferVN (talkcontribs)

I have followed all the steps and the extension is loaded, but I am getting a 500 error and I don't know why.

Maybe I have not configured it right?

I first created an inline policy allowing all permissions for the bucket hello, then created an IAM role with that inline policy attached, and then attached that IAM role to the EC2 instance on which MediaWiki is running.

My S3 bucket is hello, and it has a folder inside it called helloworld that holds all the images, so my LocalSettings.php should look like this, right?

FILE STRUCTURE: hello/helloworld/... (all the images)

wfLoadExtension( 'AWS' );
$wgAWSBucketName = "hello";
$wgAWSBucketTopSubdirectory = "/helloworld";


Help, I am new to this.
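
One likely gap in the snippet above: Extension:AWS also needs $wgAWSRegion, which is absent here, and a missing or wrong region is a plausible cause of errors like this. A minimal sketch for the layout described (bucket hello, images under helloworld/); the region value is an assumption, so substitute the one your bucket actually uses:

wfLoadExtension( 'AWS' );

// Assumption: replace with your bucket's actual region.
$wgAWSRegion = 'us-east-1';

$wgAWSBucketName = 'hello';

// Top-level directory inside the bucket that holds the images;
// note the leading "/", as in the snippet above.
$wgAWSBucketTopSubdirectory = '/helloworld';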

Reply to "File not found Although this PHP script (/img_auth.php) exists, the file requested for output (mwstore://AmazonS3/local-public/Pipe_header.jpg) does not."

Access denied: Writing thumbnail to s3

Pyhotshot (talkcontribs)

Hi, I am running into an S3 permission issue when MediaWiki tries to PUT a thumbnail to S3 using an HTTPS URL. The MediaWiki container role has full access to S3, and when I upload an image it gets uploaded to s3://<bucket>/name.jpg. But when reading it back, MediaWiki tries to create a thumbnail and PUT it in the S3 thumb/ directory, and that fails.

How do I let MediaWiki upload to the thumb/ directory using a pre-signed S3 URL (the way it downloads)? From the logs below it is clearly trying to PUT to the HTTPS bucket path, but it is not a signed URL, and my S3 bucket only accepts signed HTTPS requests. Please help!

2024-08-19 19:36:26 mediawiki-7dc7d47f89-nbj86 mediawikidb: S3FileBackend: doPrepareInternal: S3 bucket wonderfulbali8567, dir=thumb/Husky1.jpg, params=dir
2024-08-19 19:36:26 mediawiki-7dc7d47f89-nbj86 mediawikidb: S3FileBackend: isSecure: checking the presence of thumb/.htsecure in S3 bucket wonderfulbali8567
2024-08-19 19:36:26 mediawiki-7dc7d47f89-nbj86 mediawikidb: S3FileBackend: doCreateInternal(): saving thumb/Husky1.jpg/120px-Husky1.jpg in S3 bucket wonderfulbali8567 (sha1 of the original file: cwyxvni7t03ivhv6worr9duqucn8pyr, Content-Type: image/jpeg)
2024-08-19 19:36:26 mediawiki-7dc7d47f89-nbj86 mediawikidb: S3FileBackend: exception AccessDenied in createOrStore from PutObject (false): Error executing "PutObject" on "https://wonderfulbali8567.s3.amazonaws.com/thumb/Husky1.jpg/120px-Husky1.jpg"; AWS HTTP error: Client error: `PUT https://wonderfulbali8567.s3.amazonaws.com/thumb/Husky1.jpg/120px-Husky1.jpg` resulted in a `403 Forbidden` response:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>FPSPG8 (truncated...)
AccessDenied (client): Access Denied - <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>FPSPG8T46ABA0RG4</RequestId><HostId>QGbDRrFC20ZqZwllbnCB/M96zukfrbEi/cdSQNG7DF+MEjjMfIHf5I5VI0i1uplA+p5jTPwVb0M=</HostId></Error>
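
A note on the "not a signed URL" reading: the AWS SDK for PHP, which this extension uses, signs every request with Signature Version 4 in the Authorization header, so the PUT above is signed even though no signature parameters show in the URL. A 403 AccessDenied on PutObject therefore usually points at the role's IAM policy not covering the thumb/... keys. A minimal standalone sketch to test whether the wiki's credentials can write under that prefix, using the bucket name from the logs (the test key is hypothetical; the region is an assumption):

<?php
// Run from the MediaWiki root so the Composer autoloader used by
// Extension:AWS is found.
require_once __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

$s3 = new S3Client( [
    'region'  => 'us-east-1', // assumption: use your bucket's region
    'version' => 'latest',
] );

try {
    // The same kind of PutObject call the extension makes for thumbnails.
    $s3->putObject( [
        'Bucket' => 'wonderfulbali8567',         // bucket from the logs above
        'Key'    => 'thumb/permission-test.txt', // hypothetical test key
        'Body'   => 'permission test',
    ] );
    echo "PutObject succeeded: these credentials can write under thumb/\n";
} catch ( AwsException $e ) {
    echo "PutObject failed: " . $e->getAwsErrorCode() . "\n";
}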

Reply to "Access denied: Writing thumbnail to s3"

Write existing files to S3 using extension

Pyhotshot (talkcontribs)

Has anyone written existing files from the /images folder to S3? I set up the AWS extension recently, and as I upload an image it now gets written to S3 storage. But I have pre-existing images that I want to serve from S3 instead of the images folder.

Also, it seems that MediaWiki is storing images in the /images folder as well as in S3. I'm afraid the volume's storage limit will be reached if it keeps storing in both. How do I avoid storing in the volume and serve everything from S3?

Samwilson (talkcontribs)

To copy existing files to S3 you can use something like s3cmd or rclone. The extension doesn't have anything built in to do that transfer.

If it's writing to S3 correctly for new uploads, it shouldn't also be uploading those to the local images directory.

205.174.22.25 (talkcontribs)

Currently it does upload to both: there is a copy of the images on EBS, and they are uploaded to S3 in 3 different sizes (not sure if that is expected). What is the setting to use S3 only for the images?

Pyhotshot (talkcontribs)

Hello @Samwilson, you are right, it is not uploading to the local images directory. I have it successfully uploading images to the S3 bucket. But I have a few other unresolved questions/issues.

  1. I was able to mv existing images from the images/ directory to S3, but how does MediaWiki learn the new address (the S3 bucket) so it serves those images from S3 instead of the local directory?
  2. The S3 images are not being served at all. What config should I set in LocalSettings.php to let MediaWiki know that images should be served from S3, for both existing and new images?
Samwilson (talkcontribs)

@Pyhotshot: Oh, sorry, I assumed that when you said you'd got it working to upload to S3, it was also serving correctly from there. So you mean it uploads to the right place, but isn't serving anything? What URL is being produced for the images (i.e. have a look in the HTML at the img src attribute)? That should give you a clue as to which bit is going wrong. What values do you have set for $wgAWSRegion, $wgAWSBucketName, $wgAWSBucketDomain, and $wgImgAuthPath?

Pyhotshot (talkcontribs)

Yes, the images are not being served.

Here are the settings; apart from these there are no other settings related to AWS S3:

wfLoadExtension( 'AWS' );
$wgAWSRegion = 'us-east-1'; # Northern Virginia
$wgAWSBucketName = "wonderfulbali234";
$wgArticlePath = "/wiki/$1";
$wgUsePathInfo = true;

Pyhotshot (talkcontribs)

https://<domainname>/wiki/File:Something1.jpg is how it shows in the inspected HTML of the wiki page, with "Error creating thumbnail: File missing". The S3 path doesn't have a wiki/ folder in it: <bucketname>/7/78/<somename>.jpg. The bucket prefix is similar to what MediaWiki autogenerated when writing to the images/ folder. Is it expecting a wiki/ folder to be in the bucket as well?
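
No wiki/ folder is needed in the bucket: $wgArticlePath only shapes page URLs. What stands out is the 7/78/... prefix — that is MediaWiki's two-level hash layout from the local images/ directory, carried over by the copy, while by default this extension expects objects at the top of the bucket with no hash directories. If that reading is right, a sketch of the setting that matches the hashed layout (hedged, since I can't verify the bucket contents):

// Assumption: objects were copied from images/ keeping MediaWiki's
// two-level hash layout (e.g. 7/78/Something1.jpg). By default the
// extension reads/writes un-hashed keys at the top of the bucket;
// this makes it use the same hashed layout instead.
$wgAWSRepoHashLevels = 2;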

Reply to "Write existing files to S3 using extension"
Song Ngư (talkcontribs)

Has anyone done it using Cloudflare R2?

It's quite similar to the other clouds, but it seems to be simpler.
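
An untested sketch of what that might look like, assuming the extension forwards a custom endpoint in $wgFileBackends['s3'] through to the AWS SDK the way it does for other S3-compatible services; the credentials, bucket name, and account ID are placeholders:

wfLoadExtension( 'AWS' );

$wgAWSCredentials = [
    'key'    => '<r2-access-key-id>',     // placeholder
    'secret' => '<r2-secret-access-key>', // placeholder
    'token'  => false,
];

$wgAWSRegion = 'auto'; // R2 accepts "auto" as the region
$wgAWSBucketName = '<bucket>';

// Assumption: this key is passed through to the underlying S3 client.
$wgFileBackends['s3']['endpoint'] = 'https://<accountid>.r2.cloudflarestorage.com';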

Reply to "Cloudflare R2"

Does the bucket need to be public?

161.0.161.26 (talkcontribs)

Hello, I'm working with a private wiki and a private S3 bucket. To display the S3 images in the wiki, does the bucket need public read access? Or can I use the extension while keeping everything private? Thanks!

Edward Chernenko (talkcontribs)

It doesn't need public read access. Private wikis serve images via /img_auth.php, not directly from S3.
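
A minimal sketch of the relevant setting, taken from how it is used elsewhere on this page (see the resolved "Could not write file" thread below):

// For a private wiki with a private bucket: store objects as private
// and serve them through MediaWiki's /img_auth.php access check.
$wgFileBackends['s3']['privateWiki'] = true;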

Reply to "Does the bucket need to be public?"

Could not write file "mwstore://AmazonS3/local-public/xx.jpg"

Waterlooglass (talkcontribs)

I'm getting the above error. Can anyone assist? Here is my LocalSettings.php (scrubbed for privacy):

$wgFileBackends['s3'];

wfLoadExtension( 'AWS' );

// Configure AWS credentials.
// THIS IS NOT NEEDED if your EC2 instance has an IAM instance profile.
$wgAWSCredentials = [
    'key' => 'xx',
    'secret' => 'xxx',
    'token' => false
];

$wgAWSRegion = 'us-east-1'; # Northern Virginia

// Replace <something> with the name of your S3 bucket, e.g. wonderfulbali234.
$wgAWSBucketName = "xxx";


And this is the policy we have:


"Statement": [

{

"Effect": "Allow",

"Action": "s3:*",

"Resource": "arn:aws:s3:::<bucketname>*"

},

{

"Effect": "Allow",

"Action": [

"s3:Get*",

"s3:List*"

],

"Resource": "arn:aws:s3:::<bucketname>"

}

]

Waterlooglass (talkcontribs)

I removed the credentials from our LocalSettings file and tried to just use our IAM role, and now I'm getting this error:


[746bd41fcda522fdafb85fb8] /wiki/Special:Upload Aws\Exception\CredentialsException: Error retrieving credentials from the instance profile metadata service. (cURL error 28: Connection timed out after 1001 milliseconds (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://169.254.169.254/latest/meta-data/iam/security-credentials/)

Ciencia Al Poder (talkcontribs)

> Connection timed out after 1001 milliseconds

This looks like a firewall blocking the connection, or some URL being set incorrectly.
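
The URL in the error (http://169.254.169.254/...) is the EC2 instance metadata service, so the timeout means that service is unreachable from wherever PHP runs; if MediaWiki runs in a container, the IMDSv2 hop limit defaulting to 1 is a common cause. A minimal sketch to run the same credential lookup the SDK performs, assuming the SDK is installed via Composer as the extension requires:

<?php
// Run from the MediaWiki root on the affected host.
require_once __DIR__ . '/vendor/autoload.php';

use Aws\Credentials\CredentialProvider;

// The same instance-profile lookup the S3 client falls back to when
// no explicit credentials are configured in LocalSettings.php.
$provider = CredentialProvider::instanceProfile();

try {
    $creds = $provider()->wait();
    echo "Got credentials for access key: " . $creds->getAccessKeyId() . "\n";
} catch ( Exception $e ) {
    echo "Instance profile lookup failed: " . $e->getMessage() . "\n";
}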

Reply to "Could not write file "mwstore://AmazonS3/local-public/xx.jpg""

Could not write file "mwstore://AmazonS3/local-public/...

Ajmichels (talkcontribs)

(Also posted in GitHub issues for this repo)

I recently started getting these errors and I am struggling to figure out why.

Nothing has changed in my AWS configuration. The IAM configuration is still good, and none of the bucket settings have changed.

I am on PHP 7.4, MediaWiki 1.35, Extension:AWS 0.11.1. This hasn't really changed either.

I did recently update my Composer dependencies. Per the MediaWiki documentation, I removed my composer.lock file and ran composer install.

Files are still being read from the bucket correctly.

Does anyone have troubleshooting suggestions or know what the issue is?

I verified that the AWS credentials I am using are still working correctly. I also tried using the latest code from the extension's repo.

To be clear, this was working just fine a few weeks ago and the only thing that has changed since then was that I updated the composer dependencies and I enabled the VisualEditor functionality.

Here is the error I am seeing in the debug logs (some information obfuscated):

<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>NM94VF (truncated...)
 AccessDenied (client): Access Denied - <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>NM94VF*******</RequestId><HostId>u6uU*************************************************</HostId></Error>
[error] [de39d4fe79d16409eda7a6cf] /wiki/Special:Upload   ErrorException from line 1104 of /var/www/html/extensions/AWS/s3/AmazonS3FileBackend.php: PHP Warning: doCreateInternal: S3Exception: Error executing "PutObject" on "******/Shopify_Photoshop_Actions.atn.zip"; AWS HTTP error: Client error: `PUT *******/Shopify_Photoshop_Actions.atn.zip` resulted in a `403 Forbidden` response:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>NM94VF (truncated...)
 AccessDenied (client): Access Denied - <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>NM94VF*******</RequestId><HostId>u6uU*************************************************</HostId></Error>
#0 [internal function]: MWExceptionHandler::handleError()
#1 /var/www/html/extensions/AWS/s3/AmazonS3FileBackend.php(1104): trigger_error()
#2 /var/www/html/extensions/AWS/s3/AmazonS3FileBackend.php(1031): AmazonS3FileBackend->logException()
#3 /var/www/html/extensions/AWS/s3/AmazonS3FileBackend.php(347): AmazonS3FileBackend->runWithExceptionHandling()
#4 /var/www/html/extensions/AWS/s3/AmazonS3FileBackend.php(369): AmazonS3FileBackend->doCreateInternal()
#5 /var/www/html/includes/libs/filebackend/FileBackendStore.php(187): AmazonS3FileBackend->doStoreInternal()
#6 /var/www/html/includes/libs/filebackend/fileop/StoreFileOp.php(74): FileBackendStore->storeInternal()
#7 /var/www/html/includes/libs/filebackend/fileop/FileOp.php(301): StoreFileOp->doAttempt()
#8 /var/www/html/includes/libs/filebackend/FileOpBatch.php(176): FileOp->attempt()
#9 /var/www/html/includes/libs/filebackend/FileOpBatch.php(132): FileOpBatch::runParallelBatches()
#10 /var/www/html/includes/libs/filebackend/FileBackendStore.php(1308): FileOpBatch::attempt()
...

And here are the versions Composer is using for the extension's dependencies:

  - Locking aws/aws-sdk-php (3.209.17)
  - Locking composer/installers (v1.12.0)
Ajmichels (talkcontribs)

The issue was that my wiki and bucket are private and I did not have $wgFileBackends['s3']['privateWiki'] = true; in my LocalSettings. I am still not sure how this was working before and then stopped, but... it is working now.

Thanks to Edward for helping me figure it out on GitHub.

Reply to "Could not write file "mwstore://AmazonS3/local-public/..."

Who should be using this extension?

65.92.83.38 (talkcontribs)

What kind of wiki is this good for?

Kghbln (talkcontribs)
Reply to "Who should be using this extension?"

JSON for IAM Policy update

HyverDev (talkcontribs)

I've been looking at this, and it seems the JSON for the IAM role isn't correct anymore. Maybe Amazon changed their policy grammar since the original entry was written. This is what I have got to:

{
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::<something>/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Resource": "arn:aws:s3:::<something>"
        }
    ]
}
Edward Chernenko (talkcontribs)

Nothing changed. The example in the article was always supposed to be inserted into the Statement array. It is not a "replace your IAM inline policy with this" example, because an IAM inline policy may already exist (and contain other rules that shouldn't be overwritten).

MyWikis-JeffreyWang (talkcontribs)

I think the OP's sentiment is valid. Not sure why the documentation doesn't include this; it would make the setup less confusing for those who are new to S3/IAM.

MyWikis-JeffreyWang (talkcontribs)

Upon inspection, it did, but as a citation. Since this is very important, I've taken it out of a footnote.

DiscordiaChaos (talkcontribs)

Is there an example file that will work for someone who created a brand-new bucket just for this?

I'm asking about this due to hearing about increased security issues regarding AWS, and I want to keep things locked down while still enabling regular use of MediaWiki.

[Edited to make things more clear and reduce confusion]

MyWikis-JeffreyWang (talkcontribs)

@DiscordiaChaos The above JSON, in its exact form (apart from the ARN needing to be filled in), should be safe.

Reply to "JSON for IAM Policy update"

PDF support lacking

47.36.146.194 (talkcontribs)

It doesn't seem to work very well with the PDF thumbnail pages MediaWiki generates by default. That is probably just a function of reality: generating thumbnails of a PDF stored on S3 on the fly doesn't make a lot of sense, and it should probably be processed as a batch job rather than done on the fly as configured by default.
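
On the batch-job idea, MediaWiki core already has a knob for prerendering thumbnails through the job queue instead of at view time; a sketch, with the width list as an arbitrary example (whether this covers the PDF page thumbnails depends on the handler in use):

// Render upload thumbnails via the job queue rather than on the fly.
$wgUploadThumbnailRenderMethod = 'jobqueue';

// Widths (in pixels) to prerender at upload time; example values.
$wgUploadThumbnailRenderMap = [ 120, 320, 800 ];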

Reply to "PDF support lacking"