Uploading large files

Latest revision as of 13:57, 20 December 2021

Requirements

You need:

  • the URL of the media file to upload
  • a text file with the first revision content
  • the name of the user account for this first revision and upload

MediaWiki currently doesn't support files larger than 4 GB (the file size is stored as a 32-bit unsigned integer), while our Swift backend storage is limited to 5 GB. See phab:T191804 and phab:T191802 for discussions about extending this limit to 5 GB and beyond, respectively.
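Before going further, it can save time to confirm the file actually fits under the limit. A minimal sketch (the helper name and use of GNU `stat` are assumptions, not an existing tool):

```shell
# check_size: minimal sketch of a pre-flight check against the 4 GB
# MediaWiki limit (the 5 GB Swift limit is higher, so 4 GB is the binding one).
check_size() {
  local max=$((4 * 1024 * 1024 * 1024))   # 2^32 bytes, the 32-bit size limit
  local size
  size=$(stat -c %s "$1") || return 2     # GNU stat prints size in bytes
  if [ "$size" -ge "$max" ]; then
    echo "$1 is $size bytes; exceeds the 4 GB limit" >&2
    return 1
  fi
  echo "$1 is $size bytes; within the limit"
}
```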

Step 1: download files

Download the files to mwmaint1002 (or if there's not enough space, deploy1001).

wget <URL>

At this stage, it is worth verifying the file's hash, if one is known.
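This check can be scripted; a minimal sketch, assuming the requestor supplied a SHA-256 digest (the helper name is illustrative):

```shell
# verify_sha256: compare a file's SHA-256 digest against an expected value
# (the expected digest would come from the requestor).
verify_sha256() {
  local actual
  actual=$(sha256sum "$1" | awk '{print $1}')
  if [ "$actual" = "$2" ]; then
    echo "hash OK"
  else
    echo "hash MISMATCH: got $actual" >&2
    return 1
  fi
}
```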

Requestors are advised to provide a direct link to the file to be uploaded. However, they occasionally use a public cloud service instead (such as Google Drive), which usually doesn't provide direct download links.

From Google Drive, it is possible to download a file using its unique ID via en:rclone:

urbanecm@titanium  /nas/urbanecm/wmf-rehosting
$ rclone -P backend copyid <config>: '<fileid>' '<filename>'

where:

  • <config> refers to the name of the rclone config entry (you can use rclone config to see/edit the config entries)
  • <fileid> is the ID of the file at Google (for https://drive.google.com/file/d/1K9QrMXyhPqlvc-vQRjVmT8YrbgYjfelC/view, the ID is 1K9QrMXyhPqlvc-vQRjVmT8YrbgYjfelC)
  • <filename> is the name you want to store the file under
Since rclone is not installed on production servers, this requires first copying the file to a temporary location and then transferring it to the maintenance server. It does not, however, mean downloading the file to the administrator's own laptop (which might have capacity or connection speed issues).
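The hand-off to the maintenance host could look like the following sketch; the function name is illustrative, and the destination hostname and path are assumptions based on the example above, not a documented target:

```shell
# push_to_maintenance: sketch of transferring the rclone-fetched file from the
# intermediate host to the production maintenance server.
# Hostname and destination path are illustrative assumptions.
push_to_maintenance() {
  local src=$1
  # -P shows progress and allows resuming a partial transfer of a large file
  rsync -avP "$src" "mwmaint1002.eqiad.wmnet:/tmp/uploads/"
}
```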

Step 2: import image to Commons

Server-side uploads run much faster because of minimal network overhead, and as a result can put extra strain on the job queue, especially for videos, which require transcoding. It's recommended to add some delay between uploads with the --sleep parameter. Because various factors (resolution, fps, length) affect how long a video's transcodes take, it may be worth uploading one video, seeing how long the median transcode takes, and then sleeping for that length of time to avoid queuing up a large number of transcodes.
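Turning that observation into a --sleep value could be sketched as below; the helper name and the 20% headroom figure are assumptions, not established policy:

```shell
# suggest_sleep: given the observed wall-clock time of one transcode (seconds),
# suggest a --sleep value with ~20% headroom so the job queue can drain
# between uploads. The headroom figure is an illustrative assumption.
suggest_sleep() {
  local transcode_secs=$1
  echo $(( transcode_secs + transcode_secs / 5 ))
}
```

For example, if the median transcode of a test video took 600 seconds, `suggest_sleep 600` would propose `--sleep=720`.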

mwscript importImages.php --wiki=commonswiki --sleep=SECONDS --comment-ext=txt --user=USERNAME /tmp/uploads