Reprepro is a tool for managing APT repositories.

Reprepro is able to manage multiple repositories for multiple distribution versions and one package pool. It can process updates from an incoming directory, copy package (references) between distribution versions, list all packages and/or package versions available in the repository, etc.

Reprepro maintains an internal database (a .DBM file) of the contents of the repository, which makes it quite fast and efficient.

It's installed from the Debian package <code>reprepro</code>, and is configured using the files in <code>/srv/wikimedia/conf/</code>. Running

  sudo -i reprepro <etc..etc>

will make your life better.
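If you prefer to keep your own shell environment, a hedged alternative (using the base directory and the <code>sudo -E</code> form that also appear in the troubleshooting sections below) is:

<syntaxhighlight lang=shell-session>
user@apt1001:~ $ export REPREPRO_BASE_DIR=/srv/wikimedia
user@apt1001:~ $ sudo -E reprepro ls puppet
</syntaxhighlight>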
== HOWTO ==
This section explains the most commonly needed actions/tasks involving reprepro. reprepro is running on <code>apt1001.wikimedia.org</code>.


{{Warning|content=When modifying reprepro content (adding/upgrading/removing packages), always log it in the Server Admin Log (SAL).}}
=== Browse packages ===
https://apt-browser.toolforge.org/


=== List all package versions in the repositories ===
For a given package name, use
  reprepro ls PACKAGE_NAME


For example:
<pre>
# reprepro ls puppet
puppet | 4.8.2-5~trusty1 |  trusty-wikimedia | amd64, i386, source
puppet |  4.8.2-5~bpo8+1 |  jessie-wikimedia | amd64, i386, source
puppet |  3.8.5-2~bpo8+2 |  jessie-wikimedia | amd64, i386, source
puppet |         4.8.2-5 | stretch-wikimedia | amd64, i386, source
</pre>


This shows which versions of the package are available in each distribution; for instance, jessie-wikimedia carries two different puppet versions here. Distributions not listed in the output do not carry the package at all.


To see all packages in a given distribution, use
  reprepro list DISTRIBUTION_NAME
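For instance, a minimal sketch listing everything currently published for one distribution (any codename known to the repository works):

<syntaxhighlight lang=shell-session>
user@apt1001:~ $ sudo -i reprepro list stretch-wikimedia
</syntaxhighlight>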


To find all packages in all repositories, use
  reprepro dumpreferences
or a variant thereof (see <code>reprepro --help</code> or <code>man reprepro</code>).


=== Building an unmodified third-party package for import ===
# Set up a [[Portal:Cloud_VPS | Cloud VPS]] instance for building and select the <code>role::package::builder</code> role. You could use the VMs in the packaging project.
# Alternatively, if you are ops, just log into deneb.codfw.wmnet. It is set up in exactly the same way, but is faster and already set up.
# Fetch the package source to that instance and review it. dget, wget, and curl are all valid options; dget is probably the fastest.
# Build the package for the target distribution using https://phabricator.wikimedia.org/diffusion/OPUP/browse/production/modules/package_builder/README.md$30 (see the sketch after this list).
# Copy the build to the repo server (i.e. the host with the ''install-server'' role, apt1001/2001 as of 2019-04-03). On deneb there is an rsync server that can be used directly from apt1001. Example: <code>sudo rsync -va deneb.codfw.wmnet::pbuilder-result/trusty-amd64/apertium* .</code>
# Proceed with reprepro as documented below.
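A hedged sketch of the fetch-and-build steps on the build host (the source package and URL are hypothetical; the <code>DIST=...</code> convention is the one documented in the package_builder README linked above):

<syntaxhighlight lang=shell-session>
user@build-host:~ $ dget -d https://deb.debian.org/debian/pool/main/h/hello/hello_2.10-2.dsc   # download the .dsc and referenced files only
user@build-host:~ $ dpkg-source -x hello_2.10-2.dsc                                            # unpack the source for review
user@build-host:~ $ cd hello-2.10
user@build-host:~/hello-2.10 $ DIST=buster pdebuild                                            # build for the target distribution
</syntaxhighlight>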


=== Building a patched package for import ===
(e.g. when backporting a package from a more recent Debian release)
# Import the package into git (usually operations/debs/SOURCEPACKAGENAME).
# Apply your patch(es).
# Set up a [[Portal:Cloud_VPS | Cloud VPS]] instance for building and select the <code>role::package::builder</code> role.
# Alternatively, if you are ops, just log into deneb.codfw.wmnet. It is set up in exactly the same way, but is faster and already set up.
# Check out your package on the Cloud VPS host.
# Build the package for the target distribution using https://phabricator.wikimedia.org/diffusion/OPUP/browse/production/modules/package_builder/README.md$30 (see the sketch after this list).
# Copy the build to the repo server (i.e. the host with the ''install-server'' role, apt1001/2001 as of 2019-04-03). On deneb there is an rsync server that can be used directly from apt1001. Example: <code>sudo rsync -va deneb.codfw.wmnet::pbuilder-result/trusty-amd64/apertium* .</code>
# Proceed with reprepro as documented below.
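A hedged sketch of the check-out-and-build steps (repository name and target distribution are illustrative; build results typically land under <code>/var/cache/pbuilder/result/</code> as described below):

<syntaxhighlight lang=shell-session>
user@build-host:~ $ git clone "https://gerrit.wikimedia.org/r/operations/debs/SOURCEPACKAGENAME"
user@build-host:~ $ cd SOURCEPACKAGENAME
user@build-host:~/SOURCEPACKAGENAME $ DIST=buster pdebuild
</syntaxhighlight>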


=== Automatically import files from an <code>incoming/</code> directory ===
{{note| There is a proposal to make this procedure even more automatic: [[phab:T215812 | phab T215812 - reprepro: automate incoming processing]]}}
Reprepro can automatically import packages from an upload directory, as long as all the package fields are set up correctly with the right distribution and component names. It's also vital that the <code>.changes</code> files are present. When all these conditions are met, and all these files have been uploaded to <code>/srv/wikimedia/incoming</code> (e.g. using <code>dupload</code>), you can use:

<syntaxhighlight lang=shell-session>
user@apt1001:~ $ sudo -i reprepro processincoming default
</syntaxhighlight>


It uses the rules defined in the file <code>/srv/wikimedia/conf/incoming</code>.

If the package is rejected by reprepro because one of the package control fields is wrong, or you want to override them for some other reason, use an override file (see below).

It's best to check whether the <code>/srv/wikimedia/incoming/</code> directory is empty after using <code>processincoming</code>, because reprepro should have moved/deleted all imported files. Any remaining files have not been processed.


After importing the packages you likely need to export the packages for them to be available via apt-get:

<syntaxhighlight lang=shell-session>
user@apt1001:~ $ sudo -i reprepro export
</syntaxhighlight>

=== Importing packages ===
You will need three things:
# '''Component'''
## <code>main</code>: for Wikimedia native packages, as well as Debian/Ubuntu packages that have had ''source-modifications''
## <code>universe</code>: for existing Debian/Ubuntu packages that have just been recompiled or backported for the given distribution.
# '''Distribution'''
## Usually: <code>stretch-wikimedia</code>, <code>jessie-wikimedia</code>, or <code>trusty-wikimedia</code>. This is the distribution that the package has been compiled ''for'', and ''under''. This should match the distribution field in the package's changelog.
## If your package was specifically built for Wikimedia and does not have a distribution of <code>CODENAME-wikimedia</code> listed in the changes file, then you should force reprepro to accept a <code>CODENAME-wikimedia</code> distribution. Add the <code>--ignore=wrongdistribution</code> flag to the reprepro command to do so (see the example after the include command below).
# '''Changes file'''
## If building on the packaging server, the changes file is most likely in <code>/var/cache/pbuilder/result/CODENAME-amd64/</code> along with the <code>dsc</code>, <code>buildinfo</code>, <code>tar.xz</code>, <code>deb</code>, and <code>orig.tar.gz</code> files.
## You will want to bring all of these files over to the reprepro server.

It's best to have reprepro fully manage all package aspects using the changes file that was created during the build of the package (e.g. using [[Pbuilder]]). When the changes file is present along with all files listed therein, reprepro can handle it all with:
  # reprepro -C COMPONENT include DISTRIBUTION CHANGES_FILE

For example:


  # reprepro -C main include stretch-wikimedia nagios-nrpe-server_3.0.1-3+deb9u1+wmf1_amd64.changes
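If the changes file lists a plain Debian/Ubuntu distribution rather than <code>CODENAME-wikimedia</code> (see the ''Distribution'' item above), a hedged example of forcing the import (the package name is hypothetical):
  # reprepro --ignore=wrongdistribution -C main include buster-wikimedia foo_1.0-1+wmf1_amd64.changes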


When no changes file is available, for example because you didn't build the package yourself, you can use <code>includedsc</code> and <code>includedeb</code>:
  # reprepro -C universe includedsc lucid-wikimedia varnish_2.1.2-1.dsc
  # reprepro -C universe includedeb lucid-wikimedia varnish_2.1.2-1_amd64.deb
  # reprepro -C universe includedeb lucid-wikimedia libvarnish1_2.1.2-1_amd64.deb
  # reprepro -C universe includedeb lucid-wikimedia libvarnish-dev_2.1.2-1_amd64.deb


Be aware that reprepro will remove older versions of packages without asking. They are no longer available in the pool (<code>/srv/wikimedia/pool</code>) either. However, <code>/srv/wikimedia/</code> is backed up using Amanda on [[tridge]], and many packages should be available in subversion as well.
{{outdated-inline|note=Neither tridge nor WMF's Subversion exist anymore.}}


=== Missing orig tarball ===

The original tarball should be listed in the changes file when the package is built. Often this is accomplished with a build flag: <code>dpkg-buildpackage -sa</code>, <code>debuild -sa</code>, or <code>pdebuild -- --debbuildopts -sa</code>.

If it is missing, you can tell reprepro to ignore that and look the file up in the current working directory with the <code>--ignore=missingfile</code> flag.
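A hedged example (the changes file name is hypothetical). Since <code>sudo -i</code> changes the working directory, the <code>sudo -E</code> form described under ''If signing fails'' below fits better here; run it from the directory containing the orig tarball:

<syntaxhighlight lang=shell-session>
user@apt1001:~/build $ export REPREPRO_BASE_DIR=/srv/wikimedia GNUPGHOME=/root/.gnupg
user@apt1001:~/build $ sudo -E reprepro --ignore=missingfile -C main include buster-wikimedia foo_1.0-1+wmf1_amd64.changes
</syntaxhighlight>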


=== Removing packages ===
A given binary package can be removed from a distribution using
  # reprepro remove DISTRIBUTION_NAME PACKAGE_NAME
For example:
  # reprepro remove jessie-wikimedia facter
However, usually you want to remove all binary packages built from a source package; that can be done with:
  # reprepro removesrc DISTRIBUTION_NAME SOURCE_PACKAGE_NAME
For example:
  # reprepro removesrc trusty-wikimedia openjdk-8
Removing a package from a specific component:
  # reprepro -C thirdparty/elastic65 remove stretch-wikimedia kibana
=== Removing a component ===
After removing a component's configuration in Puppet (and Puppet reconfiguring apt1001), the packages and directories on apt1001 still need to be cleaned out by hand:
<pre>
root@apt1001:/srv/wikimedia# reprepro clearvanished
There are still packages in 'buster-wikimedia|thirdparty/amd-rocm25|amd64', not removing (give --delete to do so)!
Deleting vanished identifier 'buster-wikimedia|thirdparty/amd-rocm25|i386'.
Deleting vanished identifier 'buster-wikimedia|thirdparty/amd-rocm25|source'.
</pre>
As indicated above, there will likely still be packages in the tree. You can remove them by using <code>--delete</code>:
<pre>
root@apt1001:/srv/wikimedia# reprepro --delete clearvanished
Deleting vanished identifier 'buster-wikimedia|thirdparty/amd-rocm25|amd64'.
Deleting files no longer referenced...
</pre>


=== Using the ''override'' file ===
When we are building our own packages, we should make sure that all control fields (such as the distribution name, component, priority etc.) are set correctly. Please rebuild your package if not.

However, occasionally it is necessary to override fields on a previously built package which we don't want to modify the source of and/or rebuild. Ubuntu often does the same, and just retrieves packages from Debian unstable and overrides a few fields using an override file.

We have an override file as well, in <code>/srv/wikimedia/conf/deb-override</code>. Its format is:
  # packagename   fieldname   newvalue

As an example, our Varnish packages come straight from Debian unstable (like in Ubuntu), and can be imported fine into Lucid as long as we override some package fields:
  varnish          Distribution   lucid
  libvarnish1      Distribution   lucid
  libvarnish-dev   Distribution   lucid


=== Copying between distributions ===
In some cases it's necessary to copy a '''binary''' which gets reused in a different distribution, e.g. for Go packages (which are statically linked and usually hard/impossible to rebuild in older distro versions). Note that, in contrast to typical conventions, you first need to specify the destination and then the source! In the following example we copy from stretch to buster:

<pre>
# reprepro [-C component/...] copy buster-wikimedia stretch-wikimedia prometheus-rsyslog-exporter
Exporting indices...
# reprepro ls prometheus-rsyslog-exporter
prometheus-rsyslog-exporter | 0.0.0+git20180118-1 |  jessie-wikimedia | amd64, source
prometheus-rsyslog-exporter | 0.0.0+git20180118-1 | stretch-wikimedia | amd64, source
prometheus-rsyslog-exporter | 0.0.0+git20180118-1 |  buster-wikimedia | amd64, source
</pre>
If you need to copy all binary packages of a source package, you can use '''copysrc''' (and give the source package name) instead.
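A hedged sketch, reusing the package and distributions from the example above:
<pre>
# reprepro copysrc buster-wikimedia stretch-wikimedia prometheus-rsyslog-exporter
</pre>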
# reprepro -C main include hardy-wikimedia wikimedia-base wikimedia-base_0.26_amd64.changes
# reprepro ls wikimedia-base
wikimedia-base | 0.26 |  hardy-wikimedia | amd64, source


Then, copy it to the other supported distributions, e.g. ''karmic-wikimedia'' and ''lucid-wikimedia'':
{{Note | if the source package is not present, reprepro won't do this copying. Below is an alternative procedure.}}
# reprepro copy karmic-wikimedia hardy-wikimedia wikimedia-base
 
# reprepro copy lucid-wikimedia hardy-wikimedia wikimedia-base
You can search for the .deb package and ''re-include'' it in the repository, in the right component:
# reprepro ls wikimedia-base
 
wikimedia-base | 0.26 |  hardy-wikimedia | amd64, source
<syntaxhighlight lang=shell-session>
wikimedia-base | 0.26 | karmic-wikimedia | amd64, source
user@apt1001:~ $ sudo find /srv/wikimedia/ -name *python-mwclient*
wikimedia-base | 0.26 |  lucid-wikimedia | amd64, source
/srv/wikimedia/pool/thirdparty/m/mwclient/python-mwclient_0.8.4-1_all.deb
 
user@apt1001:~ $ sudo -i reprepro -C <new_component> includedeb <new_repo> /srv/wikimedia/pool/thirdparty/m/mwclient/python-mwclient_0.8.4-1_all.deb
</syntaxhighlight>
 
Anyway, this alternative procedure is not common, and you should wonder why the source package is not available.


=== Updating external repositories ===
Reprepro has the ability to pull packages from other APT repositories automatically; this has added benefits like verifying signatures, easy management and so on. It is configured via the <tt>conf/updates</tt> configuration file, which is managed via Puppet.

To check which updates are available:
<pre>
 # reprepro --component thirdparty/tor checkupdate stretch-wikimedia
Calculating packages to get...
Updates needed for 'stretch-wikimedia|thirdparty/tor|source':
'tor': '0.3.3.9-1~d90.stretch+1' will be upgraded to '0.3.5.7-1~d90.stretch+1' (from 'tor-stretch'):
 files needed: pool/thirdparty/tor/t/tor/tor_0.3.5.7-1~d90.stretch+1.dsc pool/thirdparty/tor/t/tor/tor_0.3.5.7.orig.tar.gz pool/thirdparty/tor/t/tor/tor_0.3.5.7-1~d90.stretch+1.diff.gz
Updates needed for 'stretch-wikimedia|thirdparty/tor|amd64':
'tor': '0.3.3.9-1~d90.stretch+1' will be upgraded to '0.3.5.7-1~d90.stretch+1' (from 'tor-stretch'):
 files needed: pool/thirdparty/tor/t/tor/tor_0.3.5.7-1~d90.stretch+1_amd64.deb
'tor-dbgsym': '0.3.3.9-1~d90.stretch+1' will be upgraded to '0.3.5.7-1~d90.stretch+1' (from 'tor-stretch'):
 files needed: pool/thirdparty/tor/t/tor/tor-dbgsym_0.3.5.7-1~d90.stretch+1_amd64.deb
'tor-geoipdb': '0.3.3.9-1~d90.stretch+1' will be upgraded to '0.3.5.7-1~d90.stretch+1' (from 'tor-stretch'):
 files needed: pool/thirdparty/tor/t/tor/tor-geoipdb_0.3.5.7-1~d90.stretch+1_all.deb
</pre>


You can also get an overview of all packages currently pending for a distro using 'checkupdate':
 
<pre>
reprepro checkupdate stretch-wikimedia
</pre>


To pull in the updates for real:
<pre>
# reprepro --noskipold --component thirdparty/php72 update stretch-wikimedia
Calculating packages to get...
Getting packages...
Installing (and possibly deleting) packages...
Exporting indices...
Deleting files no longer referenced...
</pre>
 
=== Adding a new external repository ===
 
When adding a new external repository, you need to:
# add a new definition to <tt>modules/aptrepo/files/updates</tt> in puppet.git (see the sketch after this list)
# add the repo's ASCII public key(s) in <tt>modules/aptrepo/files/updates-keys/<LONG_KEYID>_<name>.gpg</tt>
# add the new component to <tt>aptrepo/files/distributions-wikimedia</tt>
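For orientation, a hedged sketch of what such an update rule can look like (the name, URL, suite, and key ID are all hypothetical; the authoritative definitions live in the Puppet files listed above):

<pre>
# modules/aptrepo/files/updates (hypothetical entry)
Name: thirdparty/example
Method: https://apt.example.org/debian
Suite: buster
Components: main>thirdparty/example
Architectures: amd64 source
VerifyRelease: 0123456789ABCDEF0123456789ABCDEF01234567
</pre>

The matching stanza in <tt>distributions-wikimedia</tt> then needs the new component added to its <tt>Components:</tt> line and the rule name referenced from its <tt>Update:</tt> field.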


See [https://gerrit.wikimedia.org/r/c/operations/puppet/+/675812 675812] for an example CR. Once the CR is merged, run the following command to perform the initial sync:
<syntaxhighlight lang="console">
$ sudo -E reprepro --verbose --component  thirdparty/gitlab update buster-wikimedia
.....snip.....
aptmethod redirects 'https://packages.gitlab.com/gitlab/gitlab-ce/debian/dists/buster/main/binary-amd64/Packages.bz2' to 'https://d20rj4el6vkp4c.cloudfront.net/7/8/debian/dists/buster/main/binary-amd64/Packages.bz2?t=1618409760_6b07d0289d193c0fe2b28b704d6a5efe45eeb096'
aptmethod got 'https://d20rj4el6vkp4c.cloudfront.net/7/8/debian/dists/buster/main/binary-amd64/Packages.bz2?t=1618409760_6b07d0289d193c0fe2b28b704d6a5efe45eeb096'
Calculating packages to get...
Getting packages...
aptmethod redirects 'https://packages.gitlab.com/gitlab/gitlab-ce/debian/pool/buster/main/g/gitlab-ce/gitlab-ce_13.10.3-ce.0_amd64.deb' to 'https://d20rj4el6vkp4c.cloudfront.net/7/8/debian/package_files/97310.deb?t=1618409761_57fc9bd551c3fae0715cb4ab072ac304dcdf8d4b'
aptmethod got 'https://d20rj4el6vkp4c.cloudfront.net/7/8/debian/package_files/97310.deb?t=1618409761_57fc9bd551c3fae0715cb4ab072ac304dcdf8d4b'


</syntaxhighlight>


=== Updating where Pull rules are used ===
If a Pull rule is configured between internal distributions, the ''checkpull'' and ''pull'' commands need to be used instead to copy packages. For example, thirdparty/cloudera is pulled from an external repository into jessie-wikimedia and then a Pull rule is present for stretch-wikimedia, so an upgrade should follow this procedure: [[Analytics/Systems/Cluster/Hadoop/Administration#Updating Cloudera Packages]].
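A hedged sketch of that flow (the distribution name is illustrative); <code>checkpull</code> only reports what the Pull rules would copy, while <code>pull</code> actually copies it:

<syntaxhighlight lang=shell-session>
user@apt1001:~ $ sudo -i reprepro checkpull stretch-wikimedia
user@apt1001:~ $ sudo -i reprepro pull stretch-wikimedia
</syntaxhighlight>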


=== If signing fails ===
If you get errors such as:
  Error: gpgme created no signature!
  This most likely means gpg is confused or produces some error libgpgme is
  not able to understand.
Make sure you are running as root AND you have exported its environment variables to point to the right gpg keyring (at least until another method of signing is set up). This usually means:
  sudo -i
or, better,
  export REPREPRO_BASE_DIR=/srv/wikimedia
  export GNUPGHOME=/root/.gnupg
  sudo -E ...
=== Wrong reprepro base dir ===
Beware: reprepro is installed in different places, on apt* servers but also on releases* servers, and they '''do not use the same reprepro base dir path'''.
Optionally you can put something like this in your .bash_profile:
<pre>
# set the right base dir for reprepro, depending on whether it's apt.wm.org or releases.wm.org
if [ "$(hostname -s | cut -c 1-8)" == "releases" ]; then
    export REPREPRO_BASE_DIR=/srv/org/wikimedia/reprepro
fi
if [ "$(hostname -s | cut -c 1-3)" == "apt" ]; then
    export REPREPRO_BASE_DIR=/srv/wikimedia
fi
</pre>
You could also export GNUPGHOME or other needed things there. Your .bash_profile can be puppetized in the '''operations/puppet''' repo under '''modules/admin/files/home'''.
=== Multiple versions of the same package ===
If you need to keep multiple versions of the same package in the repo, you will need to create versioned components, as reprepro doesn't support keeping '''foo_10-1.deb''' and '''foo_20-1.deb''' at the same time in the same component.
There should be plenty of examples of how to do this in the puppet repository.
'''TODO''': perhaps add some concrete examples here for documentation purposes.
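In the meantime, a hedged sketch of the idea (component names are hypothetical; the <code>thirdparty/elastic65</code> component seen earlier on this page follows this pattern): give each version family its own component in the distribution definition, and let each host choose the component it wants.

<pre>
# conf/distributions (sketch): one component per major version
Codename: buster-wikimedia
Components: main thirdparty/example10 thirdparty/example20
</pre>

A host then pins a single version family via its sources list entry, e.g. <code>deb http://apt.wikimedia.org/wikimedia buster-wikimedia thirdparty/example10</code>.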


== Additional info ==


Some additional bits you may find useful.
=== dput ===
You can upload packages to the incoming directory using dput too. Create a config file '''~/dput.wmf.cf''':
<pre>
[wmf]
fqdn = apt1001.wikimedia.org
login = myuser
incoming = /srv/wikimedia/incoming
method = scp
</pre>
And then use the tool like this:
<syntaxhighlight lang="shell-session">
user@laptop:~/pkg$ dput -c ~/dput.wmf.cf wmf ../pyasn1_0.4.2-3~bpo9+1~wmf1_amd64.changes
Checking signature on .changes
gpg: ../pyasn1_0.4.2-3~bpo9+1~wmf1_amd64.changes: Valid signature from 68E713981D1515F8
Uploading to wmf (via scp to apt1001.wikimedia.org):
pyasn1_0.4.2-3~bpo9+1~wmf1_amd64.buildinfo                                                                100% 8146    67.9KB/s  00:00   
pypy-pyasn1_0.4.2-3~bpo9+1~wmf1_all.deb                                                                  100%  56KB 137.4KB/s  00:00   
python-pyasn1-doc_0.4.2-3~bpo9+1~wmf1_all.deb                                                            100%  112KB 181.5KB/s  00:00   
python-pyasn1_0.4.2-3~bpo9+1~wmf1_all.deb                                                                100%  56KB  93.2KB/s  00:00   
python3-pyasn1_0.4.2-3~bpo9+1~wmf1_all.deb                                                                100%  56KB 137.7KB/s  00:00   
pyasn1_0.4.2-3~bpo9+1~wmf1_amd64.changes                                                                  100% 3166    15.0KB/s  00:00   
Successfully uploaded packages.
</syntaxhighlight>


=== dupload ===

To upload <code>.changes</code> files with dupload you will likely need a configuration like this in '''/etc/dupload.conf''':

<pre>
[...]
$cfg{'wmf'} = {
     fqdn => 'apt1001.wikimedia.org',
     method => 'scp',
     incoming => '/srv/wikimedia/incoming',
     login => 'yourusername',
};
[...]
</pre>

And then, you will be able to upload to incoming using a command like:
<pre>
$ dupload --to wmf ../pkg_version_amd64.changes
</pre>
=== Testing update filters ===
If you are working on a filter for '''modules/aptrepo/files/updates''', you can develop and test the filter locally before pushing the change to reprepro and finding out it doesn't work as expected.
Basically, download a Packages file to your laptop and use the <code>grep-dctrl</code> tool locally to see the filtered/generated file that results from applying the filter.
<syntaxhighlight lang="shell-session">
user@debian:~$ wget https://packages.cloud.google.com/apt/dists/kubernetes-xenial/main/binary-amd64/Packages
[..]
user@debian:~$ grep-dctrl \( -P 'kubeadm' -o -P 'kubelet' -o -P 'kubectl' -a -FVersion --lt 1.16 -a -FVersion --ge 1.15 \) -o \( -P 'kubernetes-cni' -o  -P 'cri-tools' \) < Packages | less
[..]
</syntaxhighlight>


== External links ==
[[Category:Ubuntu]]
[[Category:Package management]]
[[Category:SRE Infrastructure Foundations]]
