+2008-01-07 Joerg Jaspert <joerg@debian.org>
+
+ * dak/examine_package.py (check_deb): Remove the linda call. It
+ no longer provides any benefit over lintian.
+
+2008-01-06 Joerg Jaspert <joerg@debian.org>
+
+ * dak/examine_package.py (do_lintian): lintian now supports html
+ coloring, so use it.
+ (do_command): Don't escape html chars if param escaped = 1
+
+2007-12-31 Anthony Towns <ajt@debian.org>
+
+ * dak/process_new.py (recheck): pass "" for prefix_str to reject()
+ when processing result of check_dsc_against_db so we don't promote
+ warnings to rejections.
+
+2007-12-30 Joerg Jaspert <joerg@debian.org>
+
+ * dak/dak.py (init): add show-new. This is based on a patch
+ submitted by Thomas Viehmann in Bug #408318, but large parts of
+ the handling were rewritten and show-new itself was written by me.
+
+ * dak/queue_report.py (table_row): Add link to generated html page
+ for NEW package.
+
+ * dak/show_new.py: new file, generates html overview for NEW
+ packages, similar to what we see with examine-package.
+
+ * config/debian/cron.hourly: Add show-new call
+
+ * config/debian/dak.conf: Add HTMLPath for Show-New
+
+ * dak/examine_package.py (print_copyright): ignore stderr when
+ finding copyright file.
+ (main): add html option
+ (html_escape): new function
+ (escape_if_needed): ditto
+ (headline): ditto
+ (colour_output): ditto
+ (print_escaped_text): ditto
+ (print_formatted_text): ditto
+ - use these functions everywhere we generate output, as they
+ know whether we want html or not and just DTRT
+ (do_lintian): new function
+ (check_deb): use it
+ (output_deb_info): Use print_escaped_text, not print_formatted_text.
+ Also import daklib.queue; determine_new now lives there.
+
+ Also add a variable that says whether we want html output. The
+ default is disabled; show_new enables it for its use.
+ Most of the html, apart from header/footer, lives in
+ examine_package instead of show_new, as that makes it much easier
+ to deal with at the point the info is generated.
+
+
+ * dak/process_new.py (determine_new): Moved out of here.
+ (check_valid): Moved out of here.
+ (get_type): Moved out of here.
+
+ * daklib/queue.py (determine_new): Moved here.
+ (check_valid): Moved here.
+ (get_type): Moved here.
+
+ * dak/init_db.py (do_section): Remove non-US code
+
+ * dak/make_overrides.py (main): ditto
+
+ * dak/process_new.py (determine_new): ditto
+
+ * daklib/queue.py (Upload.in_override_p),
+ (Upload.check_override): ditto
+
+ * daklib/utils.py (extract_component_from_section):,
+ (poolify): ditto
+
+ * dak/import_archive.py (update_section): ditto
+
+ * dak/symlink_dists.py (fix_component_section): ditto
+
+ * scripts/debian/mkmaintainers: ditto
+
+ * scripts/debian/update-mirrorlists (masterlist): ditto
+
+ * config/debian-non-US/*: Remove subdir
+
+ * scripts/debian/update-readmenonus: Removed.
+
+
+2007-12-28 Anthony Towns <ajt@debian.org>
+
+ * daklib/utils.py (check_signature): add NOTATION_DATA and
+ NOTATION_NAME to known keywords.
+
+ * daklib/queue.py (Upload.check_source_against_db):
+
+ * dak/make_suite_file_list.py: add -f/--force option.
+
+ * dak/generate_releases.py: add -a/--apt-conf=FILE and
+ -f/--force-touch options. Pull version info from the database.
+ Make suite description optional.
+
+ * config/debian/dak.conf: update
+ Reject-Proposed-Updates::MoreInfoURL. Comment out
+ Suite::Stable::Version and ::Description.
+
+ * config/debian/apt.conf: Add hurd-i386 to unstable
+ debian-installer stanza.
+
+2007-12-28 Joerg Jaspert <joerg@debian.org>
+
+ * KEYEXPIRED is actually a known keyword. We already check it
+ earlier on and reject in case the sig is bad (or unknown).
+
+2007-12-24 Joerg Jaspert <joerg@debian.org>
+
+ * Also run lintian on the .dsc file to check the source itself.
+
+ * Replace the direct use of ar | tar etc. to get the copyright
+ file with dpkg-deb, which is made for this and lets us process
+ data.tar.bz2 (or whatever format it will be in the future).
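+
+ A minimal sketch of the idea (illustrative only, not the actual
+ examine_package.py code; the .deb path and package name are made
+ up): dpkg-deb --fsys-tarfile emits the data tarball whatever its
+ compression, and tar -xO prints the wanted member to stdout.
+
+     import subprocess
+
+     def read_copyright(deb, pkg):
+         # dpkg-deb --fsys-tarfile writes the decompressed data tarball
+         # to stdout, so it works for data.tar.gz and data.tar.bz2 alike.
+         dpkg = subprocess.Popen(["dpkg-deb", "--fsys-tarfile", deb],
+                                 stdout=subprocess.PIPE)
+         # tar -xO extracts the named member to stdout.
+         tar = subprocess.Popen(["tar", "-xOf", "-",
+                                 "./usr/share/doc/%s/copyright" % pkg],
+                                stdin=dpkg.stdout, stdout=subprocess.PIPE)
+         return tar.communicate()[0]
+
+     print(read_copyright("hello_2.2-1_i386.deb", "hello"))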
+
+2007-12-21 Joerg Jaspert <joerg@debian.org>
+
+ * Remove the (now useless) check for a pre-depends on dpkg for
+ binaries that contain bzip2-compressed data tarballs.
+
+2007-08-28 Anthony Towns <ajt@debian.org>
+
+ * process_unchecked.py: Add support for automatic BYHAND
+ processing.
+ * config/debian/dak.conf, scripts/debian/byhand-tag: Automatic
+ processing of tag-overrides.
+ * examine_package.py: Summarise duplicate copyright file entries
+ (same md5sum) with a reference to the previous instance, rather
+ than repeating them.
+ * process_new.py: When rejecting from the p-u-new or o-p-u-new
+ holding queues, don't worry if dak has its own reasons for
+ rejecting the package as well as the SRMs.
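+
+ A minimal sketch of the duplicate-copyright summarising mentioned
+ above (illustrative names, not the actual examine_package.py code):
+
+     import md5
+
+     printed_copyrights = {}
+
+     def summarise_copyright(package, text):
+         key = md5.md5(text).hexdigest()
+         if key in printed_copyrights:
+             # Same md5sum as an earlier copyright file: just point at
+             # the first instance instead of repeating the whole text.
+             return "Copyright file identical to %s" % printed_copyrights[key]
+         printed_copyrights[key] = package
+         return text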
+
+2007-06-19 Anthony Towns <ajt@debian.org>
+
+ * Add nm.debian.org pseudopackage
+
+2007-06-18 Anthony Towns <ajt@debian.org>
+
+ * daklib/logging.py: Set umask to not exclude group-writability
+ so we don't get reminded at the start of each month. Thanks to
+ Random J.
+ * dak/override.py: More changes from Herr von Wifflepuck: warn
+ if the section of the source differs from the binary section;
+ restore functionality on source-only overrides; croak if trying
+ to set the priority of a source override; never set the priority
+ of source overrides; correct a typo in logging (s/priority/section/
+ in one place)
+
+ * config/debian/apt.conf.oldstable: Added for oldstable point releases.
+ * config/debian/cron.daily: automatically accept/reject
+ oldstable-proposed-updates based on the COMMENTS directory
+
+2007-06-18 Anthony Towns <ajt@debian.org>
+
+ * config/debian/apt.conf, config/debian/apt.conf.stable,
+ config/debian/dak.conf: update for 4.0r0 (etch), and 3.1r6
+ (sarge), support for oldstable-proposed-updates, dropping m68k
+ from etch, creating etch-m68k suite, creating lenny.
+
+ * config/debian/vars: update for lenny
+
+ * config/debian/dak.conf: typo fix for Dinstall::GPGKeyring,
+ drop upload limitations, add release postgres user
+
+ * dak/process_new.py: support for automatically accepting and rejecting
+ packages from proposed-updates holding queues via COMMENTS directory
+ * cron.daily: automatically process COMMENTS-based approvals
+ and rejections for proposed-updates holding queues
+
+ * dak/process_unchecked.py: add support for oldproposedupdates
+ holding queue
+
+ * dak/control_suite.py: allow control-suite to work with etch-m68k
+
+ * dak/generate_releases.py: unlink old Release files before updating
+ them if nlinks > 1 (i.e., if two files used to be the same, maybe
+ they shouldn't be once generate-releases is run)
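+
+ A minimal sketch of the hardlink-breaking step (hypothetical path,
+ not the generate_releases.py code itself):
+
+     import os
+
+     release = "dists/stable/Release"
+     # If the file is hardlinked elsewhere (e.g. shared with another
+     # suite), unlink it first so rewriting it here leaves the other
+     # copy untouched.
+     if os.path.exists(release) and os.stat(release).st_nlink > 1:
+         os.unlink(release)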
+
+ * dak/generate_releases.py: add a couple of commented lines to make
+ it easier to deal with point releases
+
+ * dak/make_overrides.py: generate overrides for !contrib udebs
+
+ * docs/README.stable-point-release: update docs for doing a
+ point release
+
+2007-03-05 Anthony Towns <ajt@debian.org>
+
+ * config/debian/dak.conf: update for 3.1r5.
+ * scripts/debian/ssh-move: add ssh-move script from debbugs
+ * config/debian/cron.unchecked: push version info to debbugs using
+ ssh-move.
+
+2007-02-14 James Troup <troup@ries.debian.org>
+
+ * docs/README.config: remove Dinstall::GroupOverrideFilename.
+ * config/debian/dak.conf: likewise.
+ * config/debian-non-US/dak.conf: likewise.
+ * config/debian-security/dak.conf: likewise.
+
+ * daklib/queue.py (Upload.close_bugs): no longer handle NMUs or
+ experimental differently, just close the bugs and let version
+ tracking sort it out.
+ (nmu_p): remove entire class - now unused.
+ (Upload.__init__): don't use nmu_p.
+
+2007-02-08 Anthony Towns <ajt@debian.org>
+
+ * config/debian/dak.conf: update for 3.1r4. Use new 'etch'
+ signing key. Drop maximum index diffs down to 14.
+
+ * config/debian/apt.conf: add udeb support for non-free (testing,
+ unstable) and experimental.
+ * config/debian/dak.conf: likewise.
+
+ * dak/generate_releases.py (main): handle udebs in any component.
+
+ * daklib/queue.py (Upload.build_summaries): handle files without a
+ 'type' gracefully.
+
+ * dak/generate_releases.py (print_sha256_files): new function.
+ (main): use it.
+
+ * dak/process_accepted.py (stable_install): fix name of template
+ mail.
+
+ * dak/process_unchecked.py (is_stableupdate): fix invocation of
+ database.get_suite_id().
+
+ * templates/process-new.bxa_notification: Update on request
+ of/after discussion with BIS staff.
+
+ * scripts/debian/mkfilesindices: also handle proposed-updates.
+
+2007-02-08 Ryan Murray <rmurray@debian.org>
+
+ * config/debian/cron.monthly: use $ftpgroup instead of hardcoding
+ group name for chgrp of mail archives.
+
+ * daklib/queue.py (Upload.check_dsc_against_db): handle multiple
+ orig.tar.gz's by picking the first one by file id.
+
+ * dak/override.py (main): limit to binary overrides only for now.
+ (usage): update to match.
+
+ * config/debian/cron.daily: track when we have the accepted lock
+ and clean it up on exit if we have it. Take/check the
+ cron.unchecked lock just before trapping cleanup on exit.
+ Remove potato override handling. Remove any dangling symlinks in
+ /srv/incoming.d.o/buildd. Clean up apt-ftparchive's databases.
+
+ * config/debian/apt.conf: change default compression scheme for
+ both Sources and Packages to gzip and bzip2 rather than
+ uncompressed and gzip (Packages) and gzip (Sources). Use old
+ defaults for proposed-updates.
+
+ * dak/control_overrides.py (main): refuse to operate on
+ untouchable suites.
+
+ * config/debian/pseudo-packages.maintainers: drop install,
+ installation, boot-floppy, slink-cd, potato-cd and
+ nonus.debian.org. Update base.
+ * config/debian/pseudo-packages.description: likewise.
+
+ * daklib/utils.py (re_srchasver): new regex.
+ (parse_changes): use regex to split 'Source (Version)' style
+ Source fields into 'source' and 'source-version'.
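+
+ Roughly how the split works (the pattern below is an illustrative
+ approximation, not necessarily the exact re_srchasver regex):
+
+     import re
+
+     re_srchasver = re.compile(r"^(\S+)\s+\((.*)\)$")
+
+     source_field = "hello (2.2-1)"
+     m = re_srchasver.match(source_field)
+     if m:
+         changes = {"source": m.group(1), "source-version": m.group(2)}
+     else:
+         changes = {"source": source_field}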
+
+ * config/debian/cron.daily: use $base instead of hardcoding path
+ name.
+
+ * scripts/debian/mkfilesindices: source the 'vars' file and use its
+ variables instead of hardcoding path names.
+
+ * config/debian/apt.conf: switch from /org to /srv.
+ * config/debian/apt.conf.buildd: likewise.
+ * config/debian/apt.conf.stable: likewise.
+ * config/debian/cron.daily: likewise.
+ * config/debian/cron.hourly: likewise.
+ * config/debian/cron.monthly: likewise.
+ * config/debian/cron.unchecked: likewise.
+ * config/debian/cron.weekly: likewise.
+ * config/debian/dak.conf: likewise.
+ * config/debian/vars: likewise.
+ * scripts/debian/mkfilesindices: likewise.
+
+2007-02-08 James Troup <james@nocrew.org>
+
+ * dak/process_unchecked.py (check_signed_by_key): new function to
+ ensure .changes files are signed by an authorized uploader.
+ (process_it): use it.
+
+ * config/debian/dak.conf (Binary-Upload-Restrictions): new stanza
+ to configure per suite/component/architecture binary upload
+ restrictions.
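+
+ In outline the check is something like this (a plain-dict sketch of
+ the per-suite/architecture fingerprint lists from the stanza later
+ in this patch; not the real check_signed_by_key code):
+
+     restrictions = {
+         ("unstable", "arm"): [
+             "9BF093BC475BABF8B6AEA5F6D7C3F131AB2A91F5",
+         ],
+     }
+
+     def upload_allowed(suite, architecture, fingerprint):
+         allowed = restrictions.get((suite, architecture))
+         if allowed is None:
+             # No restriction configured for this suite/architecture.
+             return True
+         return fingerprint in allowed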
+
+2006-10-09 James Troup <james.troup@canonical.com>
+
+ * dak/process_unchecked.py (check_timestamps): change match to
+ search as recent versions of python-apt prefix the string with 'E: '.
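+
+ The point of the change in one line (illustrative message text; with
+ the new "E: " prefix an anchored re.match() on the old pattern no
+ longer hits, while re.search() still does):
+
+     import re
+
+     msg = "E: has 1 file(s) with a time stamp too far into the future"
+     re.match(r"has \d+ file", msg)    # -> None, match() anchors at the start
+     re.search(r"has \d+ file", msg)   # -> a match object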
+
+2006-06-26 Ryan Murray <rmurray@debian.org>
+
+ * dak/process_unchecked.py (check_files): strip optional source version
+ from Source: field in changes file, and ensure what is left is a valid
+ package name.
+
+2006-06-23 Ryan Murray <rmurray@debian.org>
+
+ * dak/process_unchecked.py (check_files): also check ProposedUpdates
+ queue for source.
+
+2006-06-18 Ryan Murray <rmurray@debian.org>
+
+ * dak/scripts/debian/update-ftpstats: look for dak-named processes
+ in the log, too.
+
+ * dak/process_unchecked.py (check_files): only check embargoed and
+ unembargoed queues if the keys are set.
+
+ * dak/config/debian-security/apt.conf: set Packages::Compress to gzip
+ and bzip2 for etch.
+
2006-06-16 James Troup <james@nocrew.org>
* dak/dak.py (init): add new-security-install.
Dir
{
- ArchiveDir "/org/ftp.debian.org/ftp/";
- OverrideDir "/org/ftp.debian.org/scripts/override/";
- CacheDir "/org/ftp.debian.org/database/";
+ ArchiveDir "/srv/ftp.debian.org/ftp/";
+ OverrideDir "/srv/ftp.debian.org/scripts/override/";
+ CacheDir "/srv/ftp.debian.org/database/";
};
Default
{
- Packages::Compress ". gzip";
- Sources::Compress "gzip";
+ Packages::Compress "gzip bzip2";
+ Sources::Compress "gzip bzip2";
Contents::Compress "gzip";
DeLinkLimit 0;
MaxContentsChange 25000;
TreeDefault
{
- Contents::Header "/org/ftp.debian.org/dak/config/debian/Contents.top";
+ Contents::Header "/srv/ftp.debian.org/dak/config/debian/Contents.top";
};
-tree "dists/proposed-updates"
+tree "dists/oldstable-proposed-updates"
{
- FileList "/org/ftp.debian.org/database/dists/proposed-updates_$(SECTION)_binary-$(ARCH).list";
- SourceFileList "/org/ftp.debian.org/database/dists/proposed-updates_$(SECTION)_source.list";
+ FileList "/srv/ftp.debian.org/database/dists/oldstable-proposed-updates_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/oldstable-proposed-updates_$(SECTION)_source.list";
Sections "main contrib non-free";
Architectures "alpha arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc source";
BinOverride "override.sarge.$(SECTION)";
Contents " ";
};
-tree "dists/testing"
+tree "dists/proposed-updates"
{
- FakeDI "dists/unstable";
- FileList "/org/ftp.debian.org/database/dists/testing_$(SECTION)_binary-$(ARCH).list";
- SourceFileList "/org/ftp.debian.org/database/dists/testing_$(SECTION)_source.list";
+ FileList "/srv/ftp.debian.org/database/dists/proposed-updates_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/proposed-updates_$(SECTION)_source.list";
Sections "main contrib non-free";
- Architectures "alpha amd64 arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc source";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc source";
BinOverride "override.etch.$(SECTION)";
ExtraOverride "override.etch.extra.$(SECTION)";
SrcOverride "override.etch.$(SECTION).src";
- Packages::Compress "gzip bzip2";
- Sources::Compress "gzip bzip2";
+ Contents " ";
+};
+
+tree "dists/testing"
+{
+ FakeDI "dists/unstable";
+ FileList "/srv/ftp.debian.org/database/dists/testing_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/testing_$(SECTION)_source.list";
+ Sections "main contrib non-free";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc source";
+ BinOverride "override.lenny.$(SECTION)";
+ ExtraOverride "override.lenny.extra.$(SECTION)";
+ SrcOverride "override.lenny.$(SECTION).src";
};
tree "dists/testing-proposed-updates"
{
- FileList "/org/ftp.debian.org/database/dists/testing-proposed-updates_$(SECTION)_binary-$(ARCH).list";
- SourceFileList "/org/ftp.debian.org/database/dists/testing-proposed-updates_$(SECTION)_source.list";
+ FileList "/srv/ftp.debian.org/database/dists/testing-proposed-updates_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/testing-proposed-updates_$(SECTION)_source.list";
Sections "main contrib non-free";
- Architectures "alpha amd64 arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc source";
- BinOverride "override.etch.$(SECTION)";
- ExtraOverride "override.etch.extra.$(SECTION)";
- SrcOverride "override.etch.$(SECTION).src";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc source";
+ BinOverride "override.lenny.$(SECTION)";
+ ExtraOverride "override.lenny.extra.$(SECTION)";
+ SrcOverride "override.lenny.$(SECTION).src";
Contents " ";
};
tree "dists/unstable"
{
- FileList "/org/ftp.debian.org/database/dists/unstable_$(SECTION)_binary-$(ARCH).list";
- SourceFileList "/org/ftp.debian.org/database/dists/unstable_$(SECTION)_source.list";
+ FileList "/srv/ftp.debian.org/database/dists/unstable_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/unstable_$(SECTION)_source.list";
Sections "main contrib non-free";
Architectures "alpha amd64 arm hppa hurd-i386 i386 ia64 mips mipsel m68k powerpc s390 sparc source";
BinOverride "override.sid.$(SECTION)";
ExtraOverride "override.sid.extra.$(SECTION)";
SrcOverride "override.sid.$(SECTION).src";
- Packages::Compress "gzip bzip2";
- Sources::Compress "gzip bzip2";
};
// debian-installer
-tree "dists/proposed-updates/main"
+tree "dists/oldstable-proposed-updates/main"
{
- FileList "/org/ftp.debian.org/database/dists/proposed-updates_main_$(SECTION)_binary-$(ARCH).list";
+ FileList "/srv/ftp.debian.org/database/dists/oldstable-proposed-updates_main_$(SECTION)_binary-$(ARCH).list";
Sections "debian-installer";
Architectures "alpha arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc";
BinOverride "override.sarge.main.$(SECTION)";
Contents " ";
};
-tree "dists/testing/main"
+tree "dists/proposed-updates/main"
{
- FileList "/org/ftp.debian.org/database/dists/testing_main_$(SECTION)_binary-$(ARCH).list";
+ FileList "/srv/ftp.debian.org/database/dists/proposed-updates_main_$(SECTION)_binary-$(ARCH).list";
Sections "debian-installer";
- Architectures "alpha amd64 arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc";
BinOverride "override.etch.main.$(SECTION)";
SrcOverride "override.etch.main.src";
BinCacheDB "packages-debian-installer-$(ARCH).db";
Packages::Extensions ".udeb";
+ Contents " ";
+};
+
+tree "dists/testing/main"
+{
+ FileList "/srv/ftp.debian.org/database/dists/testing_main_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc";
+ BinOverride "override.lenny.main.$(SECTION)";
+ SrcOverride "override.lenny.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
Contents "$(DIST)/../Contents-udeb";
};
+tree "dists/testing/non-free"
+{
+ FileList "/srv/ftp.debian.org/database/dists/testing_non-free_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc";
+ BinOverride "override.lenny.main.$(SECTION)";
+ SrcOverride "override.lenny.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
+ Contents "$(DIST)/../Contents-udeb-nf";
+};
+
tree "dists/testing-proposed-updates/main"
{
- FileList "/org/ftp.debian.org/database/dists/testing-proposed-updates_main_$(SECTION)_binary-$(ARCH).list";
+ FileList "/srv/ftp.debian.org/database/dists/testing-proposed-updates_main_$(SECTION)_binary-$(ARCH).list";
Sections "debian-installer";
- Architectures "alpha amd64 arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc";
- BinOverride "override.etch.main.$(SECTION)";
- SrcOverride "override.etch.main.src";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc";
+ BinOverride "override.lenny.main.$(SECTION)";
+ SrcOverride "override.lenny.main.src";
BinCacheDB "packages-debian-installer-$(ARCH).db";
Packages::Extensions ".udeb";
Contents " ";
tree "dists/unstable/main"
{
- FileList "/org/ftp.debian.org/database/dists/unstable_main_$(SECTION)_binary-$(ARCH).list";
+ FileList "/srv/ftp.debian.org/database/dists/unstable_main_$(SECTION)_binary-$(ARCH).list";
Sections "debian-installer";
Architectures "alpha amd64 arm hppa hurd-i386 i386 ia64 mips mipsel m68k powerpc s390 sparc";
BinOverride "override.sid.main.$(SECTION)";
Contents "$(DIST)/../Contents-udeb";
};
+tree "dists/unstable/non-free"
+{
+ FileList "/srv/ftp.debian.org/database/dists/unstable_non-free_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "alpha amd64 arm hppa hurd-i386 i386 ia64 mips mipsel m68k powerpc s390 sparc";
+ BinOverride "override.sid.main.$(SECTION)";
+ SrcOverride "override.sid.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
+ Contents "$(DIST)/../Contents-udeb-nf";
+};
+
+tree "dists/experimental/main"
+{
+ FileList "/srv/ftp.debian.org/database/dists/experimental_main_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel m68k powerpc s390 sparc";
+ BinOverride "override.sid.main.$(SECTION)";
+ SrcOverride "override.sid.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
+ Contents "$(DIST)/../Contents-udeb";
+};
+
+tree "dists/experimental/non-free"
+{
+ FileList "/srv/ftp.debian.org/database/dists/experimental_non-free_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "alpha amd64 arm hppa hurd-i386 i386 ia64 mips mipsel m68k powerpc s390 sparc";
+ BinOverride "override.sid.main.$(SECTION)";
+ SrcOverride "override.sid.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
+ Contents "$(DIST)/../Contents-udeb-nf";
+};
+
// Experimental
tree "dists/experimental"
{
- FileList "/org/ftp.debian.org/database/dists/experimental_$(SECTION)_binary-$(ARCH).list";
- SourceFileList "/org/ftp.debian.org/database/dists/experimental_$(SECTION)_source.list";
+ FileList "/srv/ftp.debian.org/database/dists/experimental_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/experimental_$(SECTION)_source.list";
Sections "main contrib non-free";
Architectures "alpha amd64 arm hppa hurd-i386 i386 ia64 mips mipsel m68k powerpc s390 sparc source";
BinOverride "override.sid.$(SECTION)";
SrcOverride "override.sid.$(SECTION).src";
Contents " ";
};
+
+tree "dists/etch-m68k"
+{
+ FakeDI "dists/unstable";
+ FileList "/srv/ftp.debian.org/database/dists/etch-m68k_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/etch-m68k_$(SECTION)_source.list";
+ Sections "main contrib non-free";
+ Architectures "m68k source";
+ BinOverride "override.etch.$(SECTION)";
+ ExtraOverride "override.etch.extra.$(SECTION)";
+ SrcOverride "override.etch.$(SECTION).src";
+};
+
+tree "dists/etch-m68k/main"
+{
+ FileList "/srv/ftp.debian.org/database/dists/etch-m68k_main_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "m68k";
+ BinOverride "override.etch.main.$(SECTION)";
+ SrcOverride "override.etch.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
+ Contents "$(DIST)/../Contents-udeb";
+};
+
+tree "dists/etch-m68k/non-free"
+{
+ FileList "/srv/ftp.debian.org/database/dists/etch-m68k_non-free_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "m68k";
+ BinOverride "override.etch.main.$(SECTION)";
+ SrcOverride "override.etch.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
+ Contents "$(DIST)/../Contents-udeb-nf";
+};
Dir
{
- ArchiveDir "/org/incoming.debian.org/buildd/";
- OverrideDir "/org/ftp.debian.org/scripts/override/";
- CacheDir "/org/ftp.debian.org/database/";
+ ArchiveDir "/srv/incoming.debian.org/buildd/";
+ OverrideDir "/srv/ftp.debian.org/scripts/override/";
+ CacheDir "/srv/ftp.debian.org/database/";
};
Default
{
- Packages::Compress "gzip";
- Sources::Compress "gzip";
+ Packages::Compress "bzip2 gzip";
+ Sources::Compress "bzip2 gzip";
DeLinkLimit 0;
FileMode 0664;
}
BinOverride "override.sid.all3";
BinCacheDB "packages-accepted.db";
- FileList "/org/ftp.debian.org/database/dists/unstable_accepted.list";
+ FileList "/srv/ftp.debian.org/database/dists/unstable_accepted.list";
PathPrefix "";
Packages::Extensions ".deb .udeb";
Sources "Sources";
BinOverride "override.sid.all3";
SrcOverride "override.sid.all3.src";
- SourceFileList "/org/ftp.debian.org/database/dists/unstable_accepted.list";
+ FileList "/srv/ftp.debian.org/database/dists/unstable_accepted.list";
};
Dir
{
- ArchiveDir "/org/ftp.debian.org/ftp/";
- OverrideDir "/org/ftp.debian.org/scripts/override/";
- CacheDir "/org/ftp.debian.org/database/";
+ ArchiveDir "/srv/ftp.debian.org/ftp/";
+ OverrideDir "/srv/ftp.debian.org/scripts/override/";
+ CacheDir "/srv/ftp.debian.org/database/";
};
Default
{
- Packages::Compress ". gzip";
- Sources::Compress "gzip";
+ Packages::Compress "gzip bzip2";
+ Sources::Compress "gzip bzip2";
Contents::Compress "gzip";
DeLinkLimit 0;
FileMode 0664;
TreeDefault
{
- Contents::Header "/org/ftp.debian.org/dak/config/debian/Contents.top";
+ Contents::Header "/srv/ftp.debian.org/dak/config/debian/Contents.top";
};
tree "dists/stable"
{
- FileList "/org/ftp.debian.org/database/dists/stable_$(SECTION)_binary-$(ARCH).list";
- SourceFileList "/org/ftp.debian.org/database/dists/stable_$(SECTION)_source.list";
+ FileList "/srv/ftp.debian.org/database/dists/stable_$(SECTION)_binary-$(ARCH).list";
+ SourceFileList "/srv/ftp.debian.org/database/dists/stable_$(SECTION)_source.list";
Sections "main contrib non-free";
- Architectures "alpha arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc source";
- BinOverride "override.sarge.$(SECTION)";
- ExtraOverride "override.sarge.extra.$(SECTION)";
- SrcOverride "override.sarge.$(SECTION).src";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc source";
+ BinOverride "override.etch.$(SECTION)";
+ ExtraOverride "override.etch.extra.$(SECTION)";
+ SrcOverride "override.etch.$(SECTION).src";
};
// debian-installer
tree "dists/stable/main"
{
- FileList "/org/ftp.debian.org/database/dists/stable_main_$(SECTION)_binary-$(ARCH).list";
+ FileList "/srv/ftp.debian.org/database/dists/stable_main_$(SECTION)_binary-$(ARCH).list";
Sections "debian-installer";
- Architectures "alpha arm hppa i386 ia64 m68k mips mipsel powerpc s390 sparc";
- BinOverride "override.sarge.main.$(SECTION)";
- SrcOverride "override.sarge.main.src";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc";
+ BinOverride "override.etch.main.$(SECTION)";
+ SrcOverride "override.etch.main.src";
BinCacheDB "packages-debian-installer-$(ARCH).db";
Packages::Extensions ".udeb";
- Contents " ";
+ Contents "$(DIST)/../Contents-udeb";
};
+
+tree "dists/stable/non-free"
+{
+ FileList "/srv/ftp.debian.org/database/dists/stable_non-free_$(SECTION)_binary-$(ARCH).list";
+ Sections "debian-installer";
+ Architectures "alpha amd64 arm hppa i386 ia64 mips mipsel powerpc s390 sparc";
+ BinOverride "override.etch.main.$(SECTION)";
+ SrcOverride "override.etch.main.src";
+ BinCacheDB "packages-debian-installer-$(ARCH).db";
+ Packages::Extensions ".udeb";
+ Contents "$(DIST)/../Contents-udeb-nf";
+};
+
for a in $ARCHS; do
cp /org/wanna-build/tmp/Packages.unstable.$a-old Packages
gzip -cd /org/incoming.debian.org/buildd/Packages.gz >> Packages
- quinn-diff -i -a /org/buildd.debian.org/web/quinn-diff/Packages-arch-specific -A $a 2>/dev/null | perl -pi -e 's#^(non-US/)?(non-free)/.*$##msg' | wanna-build -b $a/build-db --merge-partial-quinn 2> /dev/null
+ quinn-diff -i -a /org/buildd.debian.org/web/quinn-diff/Packages-arch-specific -A $a 2>/dev/null | perl -pi -e 's#^(non-free)/.*$##msg' | wanna-build -b $a/build-db --merge-partial-quinn 2> /dev/null
wanna-build -A $a -b $a/build-db --merge-packages Packages 2>/dev/null
done
rm -f Sources Packages
# Executed daily via cron, out of dak's crontab.
set -e
-export SCRIPTVARS=/org/ftp.debian.org/dak/config/debian/vars
+export SCRIPTVARS=/srv/ftp.debian.org/dak/config/debian/vars
. $SCRIPTVARS
################################################################################
NOTICE="$ftpdir/Archive_Maintenance_In_Progress"
LOCKCU="$lockdir/daily.lock"
LOCKAC="$lockdir/unchecked.lock"
+lockac=0
cleanup() {
rm -f "$NOTICE"
rm -f "$LOCKCU"
+ if [ "$lockac" -eq "1" ]; then
+ rm -f "$LOCKAC"
+ fi
}
+lockfile -l 3600 $LOCKCU
trap cleanup 0
rm -f "$NOTICE"
-lockfile -l 3600 $LOCKCU
cat > "$NOTICE" <<EOF
Packages are currently being installed and indices rebuilt.
Maintenance is automatic, starting at 13:52 US Central time, and
################################################################################
echo "Creating pre-daily-cron-job backup of projectb database..."
-pg_dump projectb > /org/ftp.debian.org/backup/dump_$(date +%Y.%m.%d-%H:%M:%S)
+pg_dump projectb > $base/backup/dump_$(date +%Y.%m.%d-%H:%M:%S)
################################################################################
################################################################################
+TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
+cd $queuedir/p-u-new
+date -u -R >> REPORT
+dak process-new -a -C COMMENTS >> REPORT
+echo >> REPORT
+
+TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
+cd $queuedir/o-p-u-new
+date -u -R >> REPORT
+dak process-new -a -C COMMENTS >> REPORT
+echo >> REPORT
+
+################################################################################
+
TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
lockfile $LOCKAC
+lockac=1
cd $accepted
rm -f REPORT
dak process-accepted -pa *.changes | tee REPORT | \
dak check-overrides
rm -f $LOCKAC
+lockac=0
symlinks -d -r $ftpdir
#cat $extoverridedir/task | perl -ne 'print if /^\S+\sTask\s\S+(,\s*\S+)*$/;' > override.sid.extra.main
# FIXME
-rm -f override.potato.all3 override.sid.all3
-for i in main contrib non-free; do cat override.potato.$i >> override.potato.all3; done
+rm -f override.sid.all3
for i in main contrib non-free main.debian-installer; do cat override.sid.$i >> override.sid.all3; done
TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
# Needs to be rebuilt, as files have moved. Due to unaccepts, we need to
# update this before wanna-build is updated.
psql projectb -A -t -q -c "SELECT filename FROM queue_build WHERE suite = 5 AND queue = 0 AND in_queue = true AND filename ~ 'd(sc|eb)$'" > $dbdir/dists/unstable_accepted.list
+symlinks -d /srv/incoming.debian.org/buildd > /dev/null
apt-ftparchive generate apt.conf.buildd
TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
./mkchecksums
#
# Fetch bugs information before unchecked processing is allowed again.
-/org/ftp.debian.org/testing/britney bugs
+$base/testing/britney allowdaklock bugs || true
rm -f $NOTICE
ssh buildd@buildd /org/wanna-build/trigger.daily
################################################################################
echo "Creating post-daily-cron-job backup of projectb database..."
-POSTDUMP=/org/ftp.debian.org/backup/dump_$(date +%Y.%m.%d-%H:%M:%S)
+POSTDUMP=$base/backup/dump_$(date +%Y.%m.%d-%H:%M:%S)
pg_dump projectb > $POSTDUMP
-(cd /org/ftp.debian.org/backup; ln -sf $POSTDUMP current)
+(cd $base/backup; ln -sf $POSTDUMP current)
################################################################################
# and one on crufty packages
dak cruft-report | tee $webdir/cruft-report-daily.txt | mail -e -s "Debian archive cruft report for $(date +%D)" ftpmaster@ftp-master.debian.org
+$scriptsdir/dm-monitor >$webdir/dm-uploaders.html
+
################################################################################
# Run mirror-split
TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
-ulimit -m 90000 -d 90000 -s 10000 -v 90000
+ulimit -m 90000 -d 90000 -s 10000 -v 200000
-run-parts --report /org/ftp.debian.org/scripts/distmnt
+run-parts --report $base/scripts/distmnt
echo Daily cron scripts successful.
R --slave --vanilla < $base/misc/ftpstats.R
TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
+
+# Clean up apt-ftparchive's databases
+
+cd $configdir
+apt-ftparchive -q clean apt.conf
+
+TS=$(($TS+1)); echo Archive maintenance timestamp $TS: $(date +%X)
+
+# Compress psql backups older than a month, but no more than 20 of them
+
+(cd $base/backup/
+ find -maxdepth 1 -mindepth 1 -type f -name 'dump_*' \! -name '*.bz2' \! -name '*.gz' -mtime +30 |
+ sort | head -n20 | while read dumpname; do
+ echo "Compressing $dumpname"
+ bzip2 -9 "$dumpname"
+ done
+)
+
+################################################################################
set -e
set -u
-export SCRIPTVARS=/org/ftp.debian.org/dak/config/debian/vars
+export SCRIPTVARS=/srv/ftp.debian.org/dak/config/debian/vars
. $SCRIPTVARS
+date -u > $ftpdir/project/trace/ftp-master.debian.org
dak import-users-from-passwd
dak queue-report -n > $webdir/new.html
+cd $queuedir/new ; dak show-new *.changes > /dev/null
set -e
set -u
-export SCRIPTVARS=/org/ftp.debian.org/dak/config/debian/vars
+export SCRIPTVARS=/srv/ftp.debian.org/dak/config/debian/vars
. $SCRIPTVARS
################################################################################
DATE=`date -d yesterday +%y%m`
-cd /org/ftp.debian.org/mail/archive
+cd /srv/ftp.debian.org/mail/archive
for m in mail bxamail; do
if [ -f $m ]; then
mv $m ${m}-$DATE
sleep 20
gzip -9 ${m}-$DATE
- chgrp debadmin ${m}-$DATE.gz
+ chgrp $ftpgroup ${m}-$DATE.gz
chmod 660 ${m}-$DATE.gz
fi;
done
set -e
set -u
-export SCRIPTVARS=/org/ftp.debian.org/dak/config/debian/vars
+export SCRIPTVARS=/srv/ftp.debian.org/dak/config/debian/vars
. $SCRIPTVARS
LOCKDAILY=""
LOCKFILE="$lockdir/unchecked.lock"
NOTICE="$lockdir/daily.lock"
+if [ -e $NOTICE ]; then exit 0; fi
+
cleanup() {
rm -f "$LOCKFILE"
if [ ! -z "$LOCKDAILY" ]; then
dak process-unchecked -a $changes >> $report
echo "--" >> $report
+ # sync with debbugs
+ $scriptsdir/ssh-move --server --ssh-identity /srv/ftp.debian.org/s3kr1t/id_debbugs-vt --ssh-move-path /home/debbugs/ssh-move --from-directory $queuedir/bts_version_track --to-directory /org/bugs.debian.org/versions/queue/ftp-master debbugs@bugs.debian.org \*.debinfo \*.versions
+
if lockfile -r3 $NOTICE; then
LOCKDAILY="YES"
psql projectb -A -t -q -c "SELECT filename FROM queue_build WHERE queue = 0 AND suite = 5 AND in_queue = true AND filename ~ 'd(sc|eb)$'" > $dbdir/dists/unstable_accepted.list
fi
done
cd $configdir
- apt-ftparchive -qq generate apt.conf.buildd
+ apt-ftparchive -qq -o APT::FTPArchive::Contents=off generate apt.conf.buildd
. $configdir/cron.buildd
fi
else
set -e
set -u
-export SCRIPTVARS=/org/ftp.debian.org/dak/config/debian/vars
+export SCRIPTVARS=/srv/ftp.debian.org/dak/config/debian/vars
. $SCRIPTVARS
################################################################################
Dinstall
{
- PGPKeyring "/org/keyring.debian.org/keyrings/debian-keyring.pgp";
- GPGKeyring "/org/keyring.debian.org/keyrings/debian-keyring.gpg";
- SigningKeyring "/org/ftp.debian.org/s3kr1t/dot-gnupg/secring.gpg";
- SigningPubKeyring "/org/ftp.debian.org/s3kr1t/dot-gnupg/pubring.gpg";
- SigningKeyIds "2D230C5F";
+ GPGKeyring {
+ "/srv/keyring.debian.org/keyrings/debian-keyring.gpg";
+ "/srv/keyring.debian.org/keyrings/debian-keyring.pgp";
+ "/srv/ftp.debian.org/keyrings/debian-maintainers.gpg";
+ };
+ SigningKeyring "/srv/ftp.debian.org/s3kr1t/dot-gnupg/secring.gpg";
+ SigningPubKeyring "/srv/ftp.debian.org/s3kr1t/dot-gnupg/pubring.gpg";
+ SigningKeyIds "6070D3A1";
SendmailCommand "/usr/sbin/sendmail -odq -oi -t";
MyEmailAddress "Debian Installer <installer@ftp-master.debian.org>";
MyAdminAddress "ftpmaster@debian.org";
BugServer "bugs.debian.org";
PackagesServer "packages.debian.org";
TrackingServer "packages.qa.debian.org";
- LockFile "/org/ftp.debian.org/lock/dinstall.lock";
+ LockFile "/srv/ftp.debian.org/lock/dinstall.lock";
Bcc "archive@ftp-master.debian.org";
- GroupOverrideFilename "override.group-maint";
FutureTimeTravelGrace 28800; // 8 hours
PastCutoffYear "1984";
SkipTime 300;
OverrideDisparityCheck "true";
StableDislocationSupport "false";
DefaultSuite "unstable";
+ UserExtensions "/srv/ftp.debian.org/dak/config/debian/extensions.py";
QueueBuildSuites
{
unstable;
};
};
+Binary-Upload-Restrictions
+{
+ Components
+ {
+ //main;
+ //contrib;
+ // Yay for consensus through GRs voted on by people not actually involved in the affected architectures
+ none;
+ };
+ unstable
+ {
+ arm
+ {
+ 9BF093BC475BABF8B6AEA5F6D7C3F131AB2A91F5;
+ 70BC7F9D8C60D2265B7076A23760DBCFFD6645AB;
+ F849E2025D1C194DE62BC6C829BE5D2268FD549F;
+ };
+ alpha
+ {
+ 9BF093BC475BABF8B6AEA5F6D7C3F131AB2A91F5;
+ 70BC7F9D8C60D2265B7076A23760DBCFFD6645AB;
+ };
+ };
+};
+
Generate-Index-Diffs
{
Options
{
- TempDir "/org/ftp.debian.org/tiffani";
- MaxDiffs { Default 90; };
+ TempDir "/srv/ftp.debian.org/tiffani";
+ MaxDiffs { Default 14; };
};
};
Mirror-Split
{
- FTPPath "/org/ftp.debian.org/ftp";
- TreeRootPath "/org/ftp.debian.org/scratch/dsilvers/treeroots";
- TreeDatabasePath "/org/ftp.debian.org/scratch/dsilvers/treedbs";
+ FTPPath "/srv/ftp.debian.org/ftp";
+ TreeRootPath "/srv/ftp.debian.org/scratch/dsilvers/treeroots";
+ TreeDatabasePath "/srv/ftp.debian.org/scratch/dsilvers/treedbs";
BasicTrees { alpha; arm; hppa; hurd-i386; i386; ia64; mips; mipsel; powerpc; s390; sparc; m68k };
CombinationTrees
{
};
};
+Show-New
+{
+ HTMLPath "/srv/ftp.debian.org/web/new/";
+}
+
Import-Users-From-Passwd
{
ValidGID "800";
// Comma separated list of users who are in Postgres but not the passwd file
- KnownPostgres "postgres,dak,katie";
+ KnownPostgres "postgres,dak,katie,release";
};
Clean-Queues
};
MyEmailAddress "Debian Archive Maintenance <ftpmaster@ftp-master.debian.org>";
- LogFile "/org/ftp.debian.org/web/removals.txt";
+ LogFile "/srv/ftp.debian.org/web/removals.txt";
Bcc "removed-packages@qa.debian.org";
};
Import-Archive
{
- ExportDir "/org/ftp.debian.org/dak/import-archive-files/";
+ ExportDir "/srv/ftp.debian.org/dak/import-archive-files/";
};
Reject-Proposed-Updates
{
StableRejector "Andreas Barth and Martin Zobel-Helas";
- MoreInfoURL "http://release.debian.org/stable/3.1/3.1r2/";
+ MoreInfoURL "http://release.debian.org/stable/4.0/4.0r3/";
};
Import-LDAP-Fingerprints
LDAPServer "db.debian.org";
ExtraKeyrings
{
- "/org/keyring.debian.org/keyrings/removed-keys.pgp";
- "/org/keyring.debian.org/keyrings/removed-keys.gpg";
- "/org/keyring.debian.org/keyrings/extra-keys.pgp";
+ "/srv/keyring.debian.org/keyrings/removed-keys.pgp";
+ "/srv/keyring.debian.org/keyrings/removed-keys.gpg";
+ "/srv/keyring.debian.org/keyrings/extra-keys.pgp";
};
KeyServer "wwwkeys.eu.pgp.net";
};
Process-New
{
- AcceptedLockFile "/org/ftp.debian.org/lock/unchecked.lock";
+ AcceptedLockFile "/srv/ftp.debian.org/lock/unchecked.lock";
};
Check-Overrides
sparc;
};
Announce "debian-changes@lists.debian.org";
- Version "3.0r6";
+ Version "3.1r6";
Origin "Debian";
- Description "Debian 3.0r6 Released 31 May 2005";
- CodeName "woody";
- OverrideCodeName "woody";
- Priority "1";
+ Description "Debian 3.1r6 Released 7 April 2007";
+ CodeName "sarge";
+ OverrideCodeName "sarge";
+ Priority "2";
Untouchable "1";
+ ChangeLogBase "dists/oldstable/";
+ UdebComponents
+ {
+ main;
+ };
};
- Stable
+ Oldstable-Proposed-Updates
{
Components
{
sparc;
};
Announce "debian-changes@lists.debian.org";
- Version "3.1r2";
+ CopyChanges "dists/oldstable-proposed-updates/";
+ CopyDotDak "/srv/ftp.debian.org/queue/oldstable-proposed-updates/";
+ CommentsDir "/srv/ftp.debian.org/queue/o-p-u-new/COMMENTS/";
+ Version "3.1-updates";
Origin "Debian";
- Description "Debian 3.1r2 Released 17 April 2006";
- CodeName "sarge";
+ Description "Debian 3.1 Proposed Updates - Not Released";
+ CodeName "sarge-proposed-updates";
OverrideCodeName "sarge";
- Priority "3";
+ OverrideSuite "oldstable";
+ Priority "2";
+ VersionChecks
+ {
+ MustBeNewerThan
+ {
+ Oldstable;
+ };
+ MustBeOlderThan
+ {
+ Stable;
+ Testing;
+ Unstable;
+ Experimental;
+ };
+ Enhances
+ {
+ Oldstable;
+ };
+ };
+ UdebComponents
+ {
+ main;
+ };
+ };
+
+ Stable
+ {
+ Components
+ {
+ main;
+ contrib;
+ non-free;
+ };
+ Architectures
+ {
+ source;
+ all;
+ alpha;
+ amd64;
+ arm;
+ hppa;
+ i386;
+ ia64;
+ mips;
+ mipsel;
+ powerpc;
+ s390;
+ sparc;
+ };
+ Announce "debian-changes@lists.debian.org";
+ // Version "4.0r1";
+ Origin "Debian";
+ // Description "Debian 4.0r1 Released 15 August 2007";
+ CodeName "etch";
+ OverrideCodeName "etch";
+ Priority "5";
Untouchable "1";
- ChangeLogBase "dists/stable/";
+ ChangeLogBase "dists/stable/";
UdebComponents
{
main;
+ non-free;
};
};
source;
all;
alpha;
+ amd64;
arm;
hppa;
i386;
ia64;
- m68k;
mips;
mipsel;
powerpc;
};
Announce "debian-changes@lists.debian.org";
CopyChanges "dists/proposed-updates/";
- CopyDotDak "/org/ftp.debian.org/queue/proposed-updates/";
- Version "3.1-updates";
+ CopyDotDak "/srv/ftp.debian.org/queue/proposed-updates/";
+ CommentsDir "/srv/ftp.debian.org/queue/p-u-new/COMMENTS/";
+ Version "4.0-updates";
Origin "Debian";
- Description "Debian 3.1 Proposed Updates - Not Released";
- CodeName "proposed-updates";
- OverrideCodeName "sarge";
+ Description "Debian 4.0 Proposed Updates - Not Released";
+ CodeName "etch-proposed-updates";
+ OverrideCodeName "etch";
OverrideSuite "stable";
Priority "4";
VersionChecks
hppa;
i386;
ia64;
- m68k;
mips;
mipsel;
powerpc;
Announce "debian-testing-changes@lists.debian.org";
Origin "Debian";
Description "Debian Testing distribution - Not Released";
- CodeName "etch";
- OverrideCodeName "etch";
+ CodeName "lenny";
+ OverrideCodeName "lenny";
Priority "5";
UdebComponents
{
main;
+ non-free;
};
};
Origin "Debian";
Description "Debian Testing distribution updates - Not Released";
CodeName "testing-proposed-updates";
- OverrideCodeName "etch";
+ OverrideCodeName "lenny";
OverrideSuite "testing";
Priority "6";
VersionChecks
UdebComponents
{
main;
+ non-free;
+ };
+ };
+
+ Etch-m68k
+ {
+ Components
+ {
+ main;
+ contrib;
+ non-free;
+ };
+ Architectures
+ {
+ source;
+ all;
+ m68k;
+ };
+ Announce "debian-testing-changes@lists.debian.org";
+ Origin "Debian";
+ Description "Debian Etch for m68k - Not Released";
+ CodeName "etch-m68k";
+ OverrideCodeName "etch";
+ Priority "5";
+ UdebComponents
+ {
+ main;
+ non-free;
};
};
+
Unstable
{
Components
UdebComponents
{
main;
+ non-free;
};
};
Unstable;
};
};
-
+ UdebComponents
+ {
+ main;
+ non-free;
+ };
};
};
SuiteMappings
{
+ "propup-version oldstable-security stable testing testing-proposed-updates unstable";
"propup-version stable-security testing testing-proposed-updates unstable";
"propup-version testing-security unstable";
+ "map oldstable oldstable-proposed-updates";
+ "map oldstable-security oldstable-proposed-updates";
"map stable proposed-updates";
"map stable-security proposed-updates";
+ "map-unreleased oldstable unstable";
"map-unreleased stable unstable";
"map-unreleased proposed-updates unstable";
"map testing testing-proposed-updates";
"map-unreleased testing-proposed-updates unstable";
};
+AutomaticByHandPackages {
+ "debian-installer-images" {
+ Source "xxx-debian-installer";
+ Section "raw-installer";
+ Extension "tar.gz";
+ Script "/srv/ftp.debian.org/dak/scripts/debian/byhand-di";
+ };
+
+ "debian-maintainers" {
+ Source "debian-maintainers";
+ Section "raw-keyring";
+ Extension "gpg";
+ Script "/srv/ftp.debian.org/dak/scripts/debian/byhand-dm";
+ };
+
+ "tag-overrides" {
+ Source "tag-overrides";
+ Section "byhand";
+ Extension "tar.gz";
+ Script "/srv/ftp.debian.org/dak/scripts/debian/byhand-tag";
+ };
+};
+
Dir
{
- Root "/org/ftp.debian.org/ftp/";
- Pool "/org/ftp.debian.org/ftp/pool/";
- Templates "/org/ftp.debian.org/dak/templates/";
+ Root "/srv/ftp.debian.org/ftp/";
+ Pool "/srv/ftp.debian.org/ftp/pool/";
+ Templates "/srv/ftp.debian.org/dak/templates/";
PoolRoot "pool/";
- Lists "/org/ftp.debian.org/database/dists/";
- Log "/org/ftp.debian.org/log/";
- Lock "/org/ftp.debian.org/lock";
- Morgue "/org/ftp.debian.org/morgue/";
+ Lists "/srv/ftp.debian.org/database/dists/";
+ Log "/srv/ftp.debian.org/log/";
+ Lock "/srv/ftp.debian.org/lock";
+ Morgue "/srv/ftp.debian.org/morgue/";
MorgueReject "reject";
- Override "/org/ftp.debian.org/scripts/override/";
- QueueBuild "/org/incoming.debian.org/buildd/";
- UrgencyLog "/org/ftp.debian.org/testing/urgencies/";
+ Override "/srv/ftp.debian.org/scripts/override/";
+ QueueBuild "/srv/incoming.debian.org/buildd/";
+ UrgencyLog "/srv/ftp.debian.org/testing/urgencies/";
Queue
{
- Accepted "/org/ftp.debian.org/queue/accepted/";
- Byhand "/org/ftp.debian.org/queue/byhand/";
- ProposedUpdates "/org/ftp.debian.org/queue/p-u-new/";
- Done "/org/ftp.debian.org/queue/done/";
- Holding "/org/ftp.debian.org/queue/holding/";
- New "/org/ftp.debian.org/queue/new/";
- Reject "/org/ftp.debian.org/queue/reject/";
- Unchecked "/org/ftp.debian.org/queue/unchecked/";
- BTSVersionTrack "/org/ftp.debian.org/queue/bts_version_track/";
+ Accepted "/srv/ftp.debian.org/queue/accepted/";
+ Byhand "/srv/ftp.debian.org/queue/byhand/";
+ ProposedUpdates "/srv/ftp.debian.org/queue/p-u-new/";
+ OldProposedUpdates "/srv/ftp.debian.org/queue/o-p-u-new/";
+ Done "/srv/ftp.debian.org/queue/done/";
+ Holding "/srv/ftp.debian.org/queue/holding/";
+ New "/srv/ftp.debian.org/queue/new/";
+ Reject "/srv/ftp.debian.org/queue/reject/";
+ Unchecked "/srv/ftp.debian.org/queue/unchecked/";
+ BTSVersionTrack "/srv/ftp.debian.org/queue/bts_version_track/";
};
};
Name "projectb";
Host "";
Port -1;
-
- NonUSName "projectb";
- NonUSHost "non-US.debian.org";
- NonUSPort -1;
- NonUSUser "auric";
- NonUSPassword "moo";
};
Architectures
{
// Pool locations on ftp-master.debian.org
- /org/ftp.debian.org/ftp/pool/
+ /srv/ftp.debian.org/ftp/pool/
{
Archive "ftp-master";
Type "pool";
-base Base system (baseX_Y.tgz) general bugs
-install Installation system
-installation Installation system
+base Base system general bugs
cdrom Installation system
-boot-floppy Installation system
spam Spam (reassign spam to here so we can complain about it)
press Press release issues
kernel Problems with the Linux kernel, or that shipped with Debian
project Problems related to project administration
general General problems (e.g. "many manpages are mode 755")
-slink-cd Slink CD
-potato-cd Potato CD
listarchives Problems with the WWW mailing list archives
+nm.debian.org New Maintainer process and nm.debian.org webpages
qa.debian.org The Quality Assurance group
ftp.debian.org Problems with the FTP site
www.debian.org Problems with the WWW site
bugs.debian.org The bug tracking system, @bugs.debian.org
-nonus.debian.org Problems with the non-US FTP site
lists.debian.org The mailing lists, debian-*@lists.debian.org
wnpp Work-Needing and Prospective Packages list
cdimage.debian.org CD Image issues
-base Anthony Towns <debootstrap@packages.debian.org>
-install Debian Install Team <debian-boot@lists.debian.org>
-installation Debian Install Team <debian-boot@lists.debian.org>
+base Base Maintainers <virtual-pkg-base-maintainers@lists.alioth.debian.org>
cdrom Debian CD-ROM Team <debian-cd@lists.debian.org>
-boot-floppy Debian Install Team <debian-boot@lists.debian.org>
press press@debian.org
bugs.debian.org Debian Bug Tracking Team <owner@bugs.debian.org>
ftp.debian.org James Troup and others <ftpmaster@ftp-master.debian.org>
+nm.debian.org New Maintainer Front-Desk <new-maintainer@debian.org>
qa.debian.org debian-qa@lists.debian.org
-nonus.debian.org Michael Beattie and others <ftpmaster@debian.org>
www.debian.org Debian WWW Team <debian-www@lists.debian.org>
mirrors Debian Mirrors Team <mirrors@debian.org>
listarchives Debian List Archive Team <listarchives@debian.org>
kernel Debian Kernel Team <debian-kernel@lists.debian.org>
lists.debian.org Debian Listmaster Team <listmaster@lists.debian.org>
spam spam@debian.org
-slink-cd Steve McIntyre <stevem@chiark.greenend.org.uk>
-potato-cd Steve McIntyre <stevem@chiark.greenend.org.uk>
wnpp wnpp@debian.org
cdimage.debian.org Debian CD-ROM Team <debian-cd@lists.debian.org>
tech-ctte Technical Committee <debian-ctte@lists.debian.org>
# locations used by many scripts
-base=/org/ftp.debian.org
+base=/srv/ftp.debian.org
ftpdir=$base/ftp
webdir=$base/web
indices=$ftpdir/indices
ftpgroup=debadmin
-copyoverrides="etch.contrib etch.contrib.src etch.main etch.main.src etch.non-free etch.non-free.src etch.extra.main etch.extra.non-free etch.extra.contrib etch.main.debian-installer woody.contrib woody.contrib.src woody.main woody.main.src woody.non-free woody.non-free.src sarge.contrib sarge.contrib.src sarge.main sarge.main.src sarge.non-free sarge.non-free.src sid.contrib sid.contrib.src sid.main sid.main.debian-installer sid.main.src sid.non-free sid.non-free.src sid.extra.contrib sid.extra.main sid.extra.non-free woody.extra.contrib woody.extra.main woody.extra.non-free sarge.extra.contrib sarge.extra.main sarge.extra.non-free"
+copyoverrides="etch.contrib etch.contrib.src etch.main etch.main.src etch.non-free etch.non-free.src etch.extra.main etch.extra.non-free etch.extra.contrib etch.main.debian-installer sarge.contrib sarge.contrib.src sarge.main sarge.main.src sarge.non-free sarge.non-free.src sid.contrib sid.contrib.src sid.main sid.main.debian-installer sid.main.src sid.non-free sid.non-free.src sid.extra.contrib sid.extra.main sid.extra.non-free sarge.extra.contrib sarge.extra.main sarge.extra.non-free lenny.contrib lenny.contrib.src lenny.main lenny.main.src lenny.non-free lenny.non-free.src lenny.extra.main lenny.extra.contrib lenny.extra.non-free"
PATH=$masterdir:$PATH
umask 022
filename = filename.replace('potato-proposed-updates', 'proposed-updates')
if os.path.isfile(filename) and not os.path.islink(filename) and not db_files.has_key(filename) and not excluded.has_key(filename):
waste += os.stat(filename)[stat.ST_SIZE]
- print filename
+ print "%s" % (filename)
################################################################################
global db_files
print "Building list of database files..."
- q = projectB.query("SELECT l.path, f.filename FROM files f, location l WHERE f.location = l.id")
+ q = projectB.query("SELECT l.path, f.filename, f.last_used FROM files f, location l WHERE f.location = l.id ORDER BY l.path, f.filename")
ql = q.getresult()
+ print "Missing files:"
db_files.clear()
for i in ql:
filename = os.path.abspath(i[0] + i[1])
db_files[filename] = ""
if os.access(filename, os.R_OK) == 0:
- daklib.utils.warn("'%s' doesn't exist." % (filename))
+ if i[2]:
+ print "(last used: %s) %s" % (i[2], filename)
+ else:
+ print "%s" % (filename)
+
filename = Cnf["Dir::Override"]+'override.unreferenced'
if os.path.exists(filename):
filename = filename[:-1]
excluded[filename] = ""
- print "Checking against existent files..."
+ print "Existent files not in db:"
os.path.walk(Cnf["Dir::Root"]+'pool/', process_dir, None)
before = time.time()
sys.stdout.write("[Deleting from source table... ")
projectB.query("DELETE FROM dsc_files WHERE EXISTS (SELECT 1 FROM source s, files f, dsc_files df WHERE f.last_used <= '%s' AND s.file = f.id AND s.id = df.source AND df.id = dsc_files.id)" % (delete_date))
+ projectB.query("DELETE FROM src_uploaders WHERE EXISTS (SELECT 1 FROM source s, files f WHERE f.last_used <= '%s' AND s.file = f.id AND s.id = src_uploaders.source)" % (delete_date))
projectB.query("DELETE FROM source WHERE EXISTS (SELECT 1 FROM files WHERE source.file = files.id AND files.last_used <= '%s')" % (delete_date))
sys.stdout.write("done. (%d seconds)]\n" % (int(time.time()-before)))
q = projectB.query("""
SELECT m.id FROM maintainer m
WHERE NOT EXISTS (SELECT 1 FROM binaries b WHERE b.maintainer = m.id)
- AND NOT EXISTS (SELECT 1 FROM source s WHERE s.maintainer = m.id)""")
+ AND NOT EXISTS (SELECT 1 FROM source s WHERE s.maintainer = m.id)
+ AND NOT EXISTS (SELECT 1 FROM src_uploaders u WHERE u.maintainer = m.id)""")
ql = q.getresult()
count = 0
q = projectB.query("""
SELECT f.id FROM fingerprint f
- WHERE NOT EXISTS (SELECT 1 FROM binaries b WHERE b.sig_fpr = f.id)
+ WHERE f.keyring IS NULL
+ AND NOT EXISTS (SELECT 1 FROM binaries b WHERE b.sig_fpr = f.id)
AND NOT EXISTS (SELECT 1 FROM source s WHERE s.sig_fpr = f.id)""")
ql = q.getresult()
if action == "list":
list(suite, component, type)
else:
+ if Cnf.has_key("Suite::%s::Untouchable" % suite) and Cnf["Suite::%s::Untouchable" % suite] != 0:
+ daklib.utils.fubar("%s: suite is untouchable" % suite)
+
Logger = daklib.logging.Logger(Cnf, "control-overrides")
if file_list:
for file in file_list:
ql = q.getresult()
if not ql:
- daklib.utils.warn("Couldn't find '%s~%s~%s'." % (package, version, architecture))
+ daklib.utils.warn("Couldn't find '%s_%s_%s'." % (package, version, architecture))
return None
if len(ql) > 1:
- daklib.utils.warn("Found more than one match for '%s~%s~%s'." % (package, version, architecture))
+ daklib.utils.warn("Found more than one match for '%s_%s_%s'." % (package, version, architecture))
return None
id = ql[0][0]
return id
# Take action
if action == "add":
if assoication_id:
- daklib.utils.warn("'%s~%s~%s' already exists in suite %s." % (package, version, architecture, suite))
+ daklib.utils.warn("'%s_%s_%s' already exists in suite %s." % (package, version, architecture, suite))
continue
else:
q = projectB.query("INSERT INTO src_associations (suite, source) VALUES (%s, %s)" % (suite_id, id))
elif action == "remove":
if assoication_id == None:
- daklib.utils.warn("'%s~%s~%s' doesn't exist in suite %s." % (package, version, architecture, suite))
+ daklib.utils.warn("'%s_%s_%s' doesn't exist in suite %s." % (package, version, architecture, suite))
continue
else:
q = projectB.query("DELETE FROM src_associations WHERE id = %s" % (assoication_id))
# Take action
if action == "add":
if assoication_id:
- daklib.utils.warn("'%s~%s~%s' already exists in suite %s." % (package, version, architecture, suite))
+ daklib.utils.warn("'%s_%s_%s' already exists in suite %s." % (package, version, architecture, suite))
continue
else:
q = projectB.query("INSERT INTO bin_associations (suite, bin) VALUES (%s, %s)" % (suite_id, id))
elif action == "remove":
if assoication_id == None:
- daklib.utils.warn("'%s~%s~%s' doesn't exist in suite %s." % (package, version, architecture, suite))
+ daklib.utils.warn("'%s_%s_%s' doesn't exist in suite %s." % (package, version, architecture, suite))
continue
else:
q = projectB.query("DELETE FROM bin_associations WHERE id = %s" % (assoication_id))
daklib.utils.fubar("No action specified.")
# Safety/Sanity check
- if action == "set" and suite != "testing":
+ if action == "set" and suite not in ["testing", "etch-m68k"]:
daklib.utils.fubar("Will not reset a suite other than testing.")
if action == "list":
# you might as well write some letters to God about how unfair entropy
# is while you're at it.'' -- 20020802143104.GA5628@azure.humbug.org.au
-## TODO: fix NBS looping for version, implement Dubious NBS, fix up output of duplicate source package stuff, improve experimental ?, add support for non-US ?, add overrides, avoid ANAIS for duplicated packages
+## TODO: fix NBS looping for version, implement Dubious NBS, fix up output of duplicate source package stuff, improve experimental ?, add overrides, avoid ANAIS for duplicated packages
################################################################################
def do_obsolete_source(duplicate_bins, bin2source):
obsolete = {}
for key in duplicate_bins.keys():
- (source_a, source_b) = key.split('~')
+ (source_a, source_b) = key.split('_')
for source in [ source_a, source_b ]:
if not obsolete.has_key(source):
if not source_binaries.has_key(source):
if bin_pkgs.has_key(binary):
key_list = [ source, bin_pkgs[binary] ]
key_list.sort()
- key = '~'.join(key_list)
+ key = '_'.join(key_list)
duplicate_bins.setdefault(key, [])
duplicate_bins[key].append(binary)
bin_pkgs[binary] = source
if previous_source != source:
key_list = [ source, previous_source ]
key_list.sort()
- key = '~'.join(key_list)
+ key = '_'.join(key_list)
duplicate_bins.setdefault(key, [])
if package not in duplicate_bins[key]:
duplicate_bins[key].append(package)
keys = duplicate_bins.keys()
keys.sort()
for key in keys:
- (source_a, source_b) = key.split("~")
+ (source_a, source_b) = key.split("_")
print " o %s & %s => %s" % (source_a, source_b, ", ".join(duplicate_bins[key]))
print
################################################################################
-import sys
+import sys, imp
import daklib.utils
################################################################################
+class UserExtension:
+ def __init__(self, user_extension = None):
+ if user_extension:
+ m = imp.load_source("dak_userext", user_extension)
+ d = m.__dict__
+ else:
+ m, d = None, {}
+ self.__dict__["_module"] = m
+ self.__dict__["_d"] = d
+
+ def __getattr__(self, a):
+ if a in self.__dict__: return self.__dict__[a]
+ if a[0] == "_": raise AttributeError, a
+ return self._d.get(a, None)
+
+ def __setattr__(self, a, v):
+ self._d[a] = v
+
+################################################################################
+
def init():
"""Setup the list of modules and brief explanation of what they
do."""
"Archive sanity checks"),
("queue-report",
"Produce a report on NEW and BYHAND packages"),
+ ("show-new",
+ "Output html for packages in NEW"),
("rm",
"Remove packages from suites"),
"Check for users with no packages in the archive"),
("import-archive",
"Populate SQL database based from an archive tree"),
+ ("import-keyring",
+ "Populate fingerprint/uid table based on a new/updated keyring"),
("import-ldap-fingerprints",
"Syncs fingerprint and uid tables with Debian LDAP db"),
("import-users-from-passwd",
def main():
"""Launch dak functionality."""
+ Cnf = daklib.utils.get_conf()
+
+ if Cnf.has_key("Dinstall::UserExtensions"):
+ userext = UserExtension(Cnf["Dinstall::UserExtensions"])
+ else:
+ userext = UserExtension()
+
functionality = init()
modules = [ command for (command, _) in functionality ]
# Invoke the module
module = __import__(cmdname.replace("-","_"))
+
+ module.dak_userext = userext
+ userext.dak_module = module
+ if userext.init is not None: userext.init(cmdname)
+
module.main()
################################################################################
################################################################################
-import errno, os, pg, re, sys
+import errno, os, pg, re, sys, md5
import apt_pkg, apt_inst
-import daklib.database, daklib.utils
+import daklib.database, daklib.utils, daklib.queue
################################################################################
re_newlinespace = re.compile('\n')
re_spacestrip = re.compile('(\s)')
-################################################################################
-
-# Colour definitions
-
-# Main
-main_colour = "\033[36m"
-# Contrib
-contrib_colour = "\033[33m"
-# Non-Free
-nonfree_colour = "\033[31m"
-# Arch
-arch_colour = "\033[32m"
-# End
-end_colour = "\033[0m"
-# Bold
-bold_colour = "\033[1m"
-# Bad maintainer
-maintainer_colour = arch_colour
+html_escaping = {'"':'&quot;', '&':'&amp;', '<':'&lt;', '>':'&gt;'}
+re_html_escaping = re.compile('|'.join(map(re.escape, html_escaping.keys())))
################################################################################
projectB = pg.connect(Cnf["DB::Name"], Cnf["DB::Host"], int(Cnf["DB::Port"]))
daklib.database.init(Cnf, projectB)
+printed_copyrights = {}
+
+# default is to not output html.
+use_html = 0
+
################################################################################
def usage (exit_code=0):
Check NEW package(s).
-h, --help show this help and exit
+ -H, --html-output output html page with inspection result
+ -f, --file-name filename for the html page
PACKAGE can be a .changes, .dsc, .deb or .udeb filename."""
sys.exit(exit_code)
+################################################################################
+# probably xml.sax.saxutils would work as well
+
+def html_escape(s):
+ return re_html_escaping.sub(lambda x: html_escaping.get(x.group(0)), s)
+
+def escape_if_needed(s):
+ if use_html:
+ return html_escape(s)
+ else:
+ return s
+
+def headline(s, level=2):
+ if use_html:
+ print "<h%d>%s</h%d>" % (level, html_escape(s), level)
+ else:
+ print "---- %s ----" % (s)
+
+# Colour definitions; 'end' is the ANSI reset sequence, not a selectable colour
+
+ansi_colours = {
+ 'main': "\033[36m",
+ 'contrib': "\033[33m",
+ 'nonfree': "\033[31m",
+ 'arch': "\033[32m",
+ 'end': "\033[0m",
+ 'bold': "\033[1m",
+ 'maintainer': "\033[32m"}
+
+html_colours = {
+ 'main': ('<span style="color: aqua">',"</span>"),
+ 'contrib': ('<span style="color: yellow">',"</span>"),
+ 'nonfree': ('<span style="color: red">',"</span>"),
+ 'arch': ('<span style="color: green">',"</span>"),
+ 'bold': ('<span style="font-weight: bold">',"</span>"),
+ 'maintainer': ('<span style="color: green">',"</span>")}
+
+def colour_output(s, colour):
+ if use_html:
+ return ("%s%s%s" % (html_colours[colour][0], html_escape(s), html_colours[colour][1]))
+ else:
+ return ("%s%s%s" % (ansi_colours[colour], s, ansi_colours['end']))
+
+def print_escaped_text(s):
+ if use_html:
+ print "<pre>%s</pre>" % (s)
+ else:
+ print s
+
+def print_formatted_text(s):
+ if use_html:
+ print "<pre>%s</pre>" % (html_escape(s))
+ else:
+ print s
+
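A self-contained restatement of the escaping helper, just to show the substitutions it is expected to perform (the sample string is made up):

    import re

    html_escaping = {'"': '&quot;', '&': '&amp;', '<': '&lt;', '>': '&gt;'}
    re_html_escaping = re.compile('|'.join(map(re.escape, html_escaping.keys())))

    def html_escape(s):
        return re_html_escaping.sub(lambda x: html_escaping.get(x.group(0)), s)

    assert html_escape('1 < 2 & "x"') == '1 &lt; 2 &amp; &quot;x&quot;'

With use_html enabled, colour_output() wraps the escaped text in the matching <span> from html_colours; otherwise it brackets the raw text with the ANSI colour and the 'end' reset.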
################################################################################
def get_depends_parts(depend) :
extracts = apt_inst.debExtractControl(deb_file)
control = apt_pkg.ParseSection(extracts)
except:
- print "can't parse control info"
+ print_formatted_text("can't parse control info")
+ # TV-COMMENT: this will raise exceptions in two lines
control = ''
deb_file.close()
nf_match = re_nonfree.search(section_str)
if c_match :
# contrib colour
- section = contrib_colour + section_str + end_colour
+ section = colour_output(section_str, 'contrib')
elif nf_match :
# non-free colour
- section = nonfree_colour + section_str + end_colour
+ section = colour_output(section_str, 'nonfree')
else :
# main
- section = main_colour + section_str + end_colour
+ section = colour_output(section_str, 'main')
if control.has_key("Architecture"):
arch_str = control.Find("Architecture")
- arch = arch_colour + arch_str + end_colour
+ arch = colour_output(arch_str, 'arch')
if control.has_key("Maintainer"):
maintainer = control.Find("Maintainer")
localhost = re_localhost.search(maintainer)
if localhost:
#highlight bad email
- maintainer = maintainer_colour + maintainer + end_colour
+ maintainer = colour_output(maintainer, 'maintainer')
+ else:
+ maintainer = escape_if_needed(maintainer)
return (control, control_keys, section, depends, recommends, arch, maintainer)
try:
dsc = daklib.utils.parse_changes(dsc_filename)
except:
- print "can't parse control info"
+ print_formatted_text("can't parse control info")
dsc_file.close()
- filecontents = strip_pgp_signature(dsc_filename)
+ filecontents = escape_if_needed(strip_pgp_signature(dsc_filename))
if dsc.has_key("build-depends"):
builddep = split_depends(dsc["build-depends"])
if dsc.has_key("architecture") :
if (dsc["architecture"] != "any"):
- newarch = arch_colour + dsc["architecture"] + end_colour
+ newarch = colour_output(dsc["architecture"], 'arch')
filecontents = re_arch.sub("Architecture: " + newarch, filecontents)
return filecontents
if ql:
i = ql[0]
+ adepends = d['name']
+ if d['version'] != '' :
+ adepends += " (%s)" % (d['version'])
+
if i[2] == "contrib":
- result += contrib_colour + d['name']
+ result += colour_output(adepends, "contrib")
elif i[2] == "non-free":
- result += nonfree_colour + d['name']
+ result += colour_output(adepends, "nonfree")
else :
- result += main_colour + d['name']
-
- if d['version'] != '' :
- result += " (%s)" % (d['version'])
- result += end_colour
+ result += colour_output(adepends, "main")
else:
- result += bold_colour + d['name']
+ adepends = d['name']
if d['version'] != '' :
- result += " (%s)" % (d['version'])
- result += end_colour
+ adepends += " (%s)" % (d['version'])
+ result += colour_output(adepends, "bold")
or_count += 1
comma_count += 1
return result
def output_deb_info(filename):
(control, control_keys, section, depends, recommends, arch, maintainer) = read_control(filename)
+ to_print = ""
if control == '':
- print "no control info"
+ print_formatted_text("no control info")
else:
for key in control_keys :
output = " " + key + ": "
elif key == 'Description':
desc = control.Find(key)
desc = re_newlinespace.sub('\n ', desc)
- output += desc
+ output += escape_if_needed(desc)
else:
- output += control.Find(key)
- print output
+ output += escape_if_needed(control.Find(key))
+ to_print += output + '\n'
+ print_escaped_text(to_print)
-def do_command (command, filename):
+def do_command (command, filename, escaped=0):
o = os.popen("%s %s" % (command, filename))
- print o.read()
+ if escaped:
+ print_escaped_text(o.read())
+ else:
+ print_formatted_text(o.read())
+
+def do_lintian (filename):
+ if use_html:
+ do_command("lintian --show-overrides --color html", filename, 1)
+ else:
+ do_command("lintian --show-overrides --color always", filename, 1)
def print_copyright (deb_filename):
package = re_package.sub(r'\1', deb_filename)
- o = os.popen("ar p %s data.tar.gz | tar tzvf - | egrep 'usr(/share)?/doc/[^/]*/copyright' | awk '{ print $6 }' | head -n 1" % (deb_filename))
+ o = os.popen("dpkg-deb -c %s | egrep 'usr(/share)?/doc/[^/]*/copyright' | awk '{print $6}' | head -n 1" % (deb_filename))
copyright = o.read()[:-1]
if copyright == "":
- print "WARNING: No copyright found, please check package manually."
+ print_formatted_text("WARNING: No copyright found, please check package manually.")
return
doc_directory = re_doc_directory.sub(r'\1', copyright)
if package != doc_directory:
- print "WARNING: wrong doc directory (expected %s, got %s)." % (package, doc_directory)
+ print_formatted_text("WARNING: wrong doc directory (expected %s, got %s)." % (package, doc_directory))
return
- o = os.popen("ar p %s data.tar.gz | tar xzOf - %s" % (deb_filename, copyright))
- print o.read()
+ o = os.popen("dpkg-deb --fsys-tarfile %s | tar xvOf - %s 2>/dev/null" % (deb_filename, copyright))
+ copyright = o.read()
+ copyrightmd5 = md5.md5(copyright).hexdigest()
+
+ if printed_copyrights.has_key(copyrightmd5) and printed_copyrights[copyrightmd5] != "%s (%s)" % (package, deb_filename):
+ print_formatted_text( "NOTE: Copyright is the same as %s.\n" % \
+ (printed_copyrights[copyrightmd5]))
+ else:
+ printed_copyrights[copyrightmd5] = "%s (%s)" % (package, deb_filename)
+
+ print_formatted_text(copyright)
def check_dsc (dsc_filename):
- print "---- .dsc file for %s ----" % (dsc_filename)
+ headline(".dsc file for %s" % (dsc_filename))
(dsc) = read_dsc(dsc_filename)
- print dsc
+ print_escaped_text(dsc)
+ headline("lintian check for %s" % (dsc_filename))
+ do_lintian(dsc_filename)
def check_deb (deb_filename):
filename = os.path.basename(deb_filename)
else:
is_a_udeb = 0
- print "---- control file for %s ----" % (filename)
+ headline("control file for %s" % (filename))
#do_command ("dpkg -I", deb_filename)
output_deb_info(deb_filename)
if is_a_udeb:
- print "---- skipping lintian check for µdeb ----"
+ headline("skipping lintian check for udeb")
print
else:
- print "---- lintian check for %s ----" % (filename)
- do_command ("lintian", deb_filename)
- print "---- linda check for %s ----" % (filename)
- do_command ("linda", deb_filename)
+ headline("lintian check for %s" % (filename))
+ do_lintian(deb_filename)
- print "---- contents of %s ----" % (filename)
+ headline("contents of %s" % (filename))
do_command ("dpkg -c", deb_filename)
if is_a_udeb:
- print "---- skipping copyright for µdeb ----"
+ headline("skipping copyright for udeb")
else:
- print "---- copyright of %s ----" % (filename)
+ headline("copyright of %s" % (filename))
print_copyright(deb_filename)
- print "---- file listing of %s ----" % (filename)
+ headline("file listing of %s" % (filename))
do_command ("ls -l", deb_filename)
# Read a file, strip the signature and return the modified contents as
# Display the .changes [without the signature]
def display_changes (changes_filename):
- print "---- .changes file for %s ----" % (changes_filename)
- print strip_pgp_signature(changes_filename)
+ headline(".changes file for %s" % (changes_filename))
+ print_formatted_text(strip_pgp_signature(changes_filename))
def check_changes (changes_filename):
display_changes(changes_filename)
# Cnf = daklib.utils.get_conf()
- Arguments = [('h',"help","Examine-Package::Options::Help")]
- for i in [ "help" ]:
- if not Cnf.has_key("Frenanda::Options::%s" % (i)):
+ Arguments = [('h',"help","Examine-Package::Options::Help"),
+ ('H',"html-output","Examine-Package::Options::Html-Output"),
+ ]
+ for i in [ "Help", "Html-Output", "partial-html" ]:
+ if not Cnf.has_key("Examine-Package::Options::%s" % (i)):
Cnf["Examine-Package::Options::%s" % (i)] = ""
args = apt_pkg.ParseCommandLine(Cnf,Arguments,sys.argv)
for file in args:
try:
- # Pipe output for each argument through less
- less_fd = os.popen("less -R -", 'w', 0)
- # -R added to display raw control chars for colour
- sys.stdout = less_fd
-
+ if not Options["Html-Output"]:
+ # Pipe output for each argument through less
+ less_fd = os.popen("less -R -", 'w', 0)
+ # -R added to display raw control chars for colour
+ sys.stdout = less_fd
try:
if file.endswith(".changes"):
check_changes(file)
else:
daklib.utils.fubar("Unrecognised file type: '%s'." % (file))
finally:
- # Reset stdout here so future less invocations aren't FUBAR
- less_fd.close()
- sys.stdout = stdout_fd
+ if not Options["Html-Output"]:
+ # Reset stdout here so future less invocations aren't FUBAR
+ less_fd.close()
+ sys.stdout = stdout_fd
except IOError, e:
if errno.errorcode[e.errno] == 'EPIPE':
daklib.utils.warn("[examine-package] Caught EPIPE; skipping.")
################################################################################
-import sys, os, popen2, tempfile, stat, time
+import sys, os, popen2, tempfile, stat, time, pg
import apt_pkg
import daklib.utils
Generate Release files (for SUITE).
-h, --help show this help and exit
+ -a, --apt-conf FILE use FILE instead of default apt.conf
+ -f, --force-touch ignore Untouchable directives in dak.conf
If no SUITE is given Release files are generated for all suites."""
def print_sha1_files (tree, files):
print_md5sha_files (tree, files, apt_pkg.sha1sum)
+def print_sha256_files (tree, files):
+ print_md5sha_files (tree, files, apt_pkg.sha256sum)
+
################################################################################
def main ():
Cnf = daklib.utils.get_conf()
- Arguments = [('h',"help","Generate-Releases::Options::Help")]
- for i in [ "help" ]:
+ Arguments = [('h',"help","Generate-Releases::Options::Help"),
+ ('a',"apt-conf","Generate-Releases::Options::Apt-Conf", "HasArg"),
+ ('f',"force-touch","Generate-Releases::Options::Force-Touch"),
+ ]
+ for i in [ "help", "apt-conf", "force-touch" ]:
if not Cnf.has_key("Generate-Releases::Options::%s" % (i)):
Cnf["Generate-Releases::Options::%s" % (i)] = ""
if Options["Help"]:
usage()
+ if not Options["Apt-Conf"]:
+ Options["Apt-Conf"] = daklib.utils.which_apt_conf_file()
+
AptCnf = apt_pkg.newConfiguration()
- apt_pkg.ReadConfigFileISC(AptCnf,daklib.utils.which_apt_conf_file())
+ apt_pkg.ReadConfigFileISC(AptCnf, Options["Apt-Conf"])
+
+ projectB = pg.connect(Cnf["DB::Name"], Cnf["DB::Host"], int(Cnf["DB::Port"]))
if not suites:
suites = Cnf.SubTree("Suite").List()
print "Processing: " + suite
SuiteBlock = Cnf.SubTree("Suite::" + suite)
- if SuiteBlock.has_key("Untouchable"):
+ if SuiteBlock.has_key("Untouchable") and not Options["Force-Touch"]:
print "Skipping: " + suite + " (untouchable)"
continue
origin = SuiteBlock["Origin"]
label = SuiteBlock.get("Label", origin)
- version = SuiteBlock.get("Version", "")
codename = SuiteBlock.get("CodeName", "")
+ version = ""
+ description = ""
+
+ q = projectB.query("SELECT version, description FROM suite WHERE suite_name = '%s'" % (suite))
+ qs = q.getresult()
+ if len(qs) == 1:
+ if qs[0][0] != "-": version = qs[0][0]
+ if qs[0][1]: description = qs[0][1]
+
if SuiteBlock.has_key("NotAutomatic"):
notautomatic = "yes"
else:
if components:
out.write("Components: %s\n" % (" ".join(components)))
- out.write("Description: %s\n" % (SuiteBlock["Description"]))
+ if description:
+ out.write("Description: %s\n" % (description))
files = []
relpath = Cnf["Dir::Root"]+tree+"/"+rel
try:
+ if os.access(relpath, os.F_OK):
+ if os.stat(relpath).st_nlink > 1:
+ os.unlink(relpath)
release = open(relpath, "w")
#release = open(longsuite.replace("/","_") + "_" + arch + "_" + sec + "_Release", "w")
except IOError:
files.append(rel)
if AptCnf.has_key("tree::%s/main" % (tree)):
- sec = AptCnf["tree::%s/main::Sections" % (tree)].split()[0]
- if sec != "debian-installer":
- print "ALERT: weird non debian-installer section in %s" % (tree)
-
- for arch in AptCnf["tree::%s/main::Architectures" % (tree)].split():
- if arch != "source": # always true
- for file in compressnames("tree::%s/main" % (tree), "Packages", "main/%s/binary-%s/Packages" % (sec, arch)):
- files.append(file)
+ for dis in ["main", "contrib", "non-free"]:
+ if not AptCnf.has_key("tree::%s/%s" % (tree, dis)): continue
+ sec = AptCnf["tree::%s/%s::Sections" % (tree,dis)].split()[0]
+ if sec != "debian-installer":
+ print "ALERT: weird non debian-installer section in %s" % (tree)
+
+ for arch in AptCnf["tree::%s/%s::Architectures" % (tree,dis)].split():
+ if arch != "source": # always true
+ for file in compressnames("tree::%s/%s" % (tree,dis),
+ "Packages",
+ "%s/%s/binary-%s/Packages" % (dis, sec, arch)):
+ files.append(file)
elif AptCnf.has_key("tree::%s::FakeDI" % (tree)):
usetree = AptCnf["tree::%s::FakeDI" % (tree)]
sec = AptCnf["tree::%s/main::Sections" % (usetree)].split()[0]
print_md5_files(tree, files)
out.write("SHA1:\n")
print_sha1_files(tree, files)
+ out.write("SHA256:\n")
+ print_sha256_files(tree, files)
out.close()
if Cnf.has_key("Dinstall::SigningKeyring"):
return None
status_read, status_write = os.pipe()
- cmd = "gpgv --status-fd %s --keyring %s --keyring %s %s" \
- % (status_write, Cnf["Dinstall::PGPKeyring"], Cnf["Dinstall::GPGKeyring"], filename)
+ cmd = "gpgv --status-fd %s %s %s" \
+ % (status_write, daklib.utils.gpg_keyring_args(), filename)
(output, status, exit_status) = daklib.utils.gpgv_get_status_output(cmd, status_read, status_write)
# Process the status-fd output
prefix = ""
else:
prefix = ""
- component = component.replace("non-US/", "")
if component != 'main':
suffix = '/' + component
else:
def get_or_set_files_id (filename, size, md5sum, location_id):
global files_id_cache, files_id_serial, files_query_cache
- cache_key = "~".join((filename, size, md5sum, repr(location_id)))
+ cache_key = "_".join((filename, size, md5sum, repr(location_id)))
if not files_id_cache.has_key(cache_key):
files_id_serial += 1
files_query_cache.write("%d\t%s\t%s\t%s\t%d\t\\N\n" % (files_id_serial, filename, size, md5sum, location_id))
(md5sum, size, filename) = line.strip().split()
# Don't duplicate .orig.tar.gz's
if filename.endswith(".orig.tar.gz"):
- cache_key = "%s~%s~%s" % (filename, size, md5sum)
+ cache_key = "%s_%s_%s" % (filename, size, md5sum)
if orig_tar_gz_cache.has_key(cache_key):
id = orig_tar_gz_cache[cache_key]
else:
if filename.endswith(".dsc"):
files_id = id
filename = directory + package + '_' + no_epoch_version + '.dsc'
- cache_key = "%s~%s" % (package, version)
+ cache_key = "%s_%s" % (package, version)
if not source_cache.has_key(cache_key):
- nasty_key = "%s~%s" % (package, version)
+ nasty_key = "%s_%s" % (package, version)
source_id_serial += 1
if not source_cache_for_binaries.has_key(nasty_key):
source_cache_for_binaries[nasty_key] = source_id_serial
filename = poolify (filename, location)
if architecture == "all":
filename = re_arch_from_filename.sub("binary-all", filename)
- cache_key = "%s~%s" % (source, source_version)
+ cache_key = "%s_%s" % (source, source_version)
source_id = source_cache_for_binaries.get(cache_key, None)
size = Scanner.Section["size"]
md5sum = Scanner.Section["md5sum"]
files_id = get_or_set_files_id (filename, size, md5sum, location_id)
type = "deb"; # FIXME
- cache_key = "%s~%s~%s~%d~%d~%d~%d" % (package, version, repr(source_id), architecture_id, location_id, files_id, suite_id)
+ cache_key = "%s_%s_%s_%d_%d_%d_%d" % (package, version, repr(source_id), architecture_id, location_id, files_id, suite_id)
if not arch_all_cache.has_key(cache_key):
arch_all_cache[cache_key] = 1
- cache_key = "%s~%s~%s~%d" % (package, version, repr(source_id), architecture_id)
+ cache_key = "%s_%s_%s_%d" % (package, version, repr(source_id), architecture_id)
if not binary_cache.has_key(cache_key):
if not source_id:
source_id = "\N"
def get_ldap_value(entry, value):
ret = entry.get(value)
- if not ret:
+ if not ret or ret[0] == "" or ret[0] == "-":
return ""
else:
# FIXME: what about > 0 ?
- return ret[0]
+ return ret[0] + " "
+
+def get_ldap_name(entry):
+ name = get_ldap_value(entry, "cn")
+ name += get_ldap_value(entry, "mn")
+ name += get_ldap_value(entry, "sn")
+ return name.rstrip()
+
+def escape_string(str):
+ return str.replace("'", "\\'")
def main():
global Cnf, projectB
l.simple_bind_s("","")
Attrs = l.search_s(LDAPDn, ldap.SCOPE_ONELEVEL,
"(&(keyfingerprint=*)(gidnumber=%s))" % (Cnf["Import-Users-From-Passwd::ValidGID"]),
- ["uid", "keyfingerprint"])
+ ["uid", "keyfingerprint", "cn", "mn", "sn"])
projectB.query("BEGIN WORK")
# Sync LDAP with DB
db_fin_uid = {}
+ db_uid_name = {}
ldap_fin_uid_id = {}
q = projectB.query("""
SELECT f.fingerprint, f.id, u.uid FROM fingerprint f, uid u WHERE f.uid = u.id
(fingerprint, fingerprint_id, uid) = i
db_fin_uid[fingerprint] = (uid, fingerprint_id)
+ q = projectB.query("SELECT id, name FROM uid")
+ for i in q.getresult():
+ (uid, name) = i
+ db_uid_name[uid] = name
+
for i in Attrs:
entry = i[1]
fingerprints = entry["keyFingerPrint"]
uid = entry["uid"][0]
+ name = get_ldap_name(entry)
uid_id = daklib.database.get_or_set_uid_id(uid)
+
+ if not db_uid_name.has_key(uid_id) or db_uid_name[uid_id] != name:
+ q = projectB.query("UPDATE uid SET name = '%s' WHERE id = %d" % (escape_string(name), uid_id))
+ print "Assigning name of %s as %s" % (uid, name)
+
for fingerprint in fingerprints:
ldap_fin_uid_id[fingerprint] = (uid, uid_id)
if db_fin_uid.has_key(fingerprint):
if not existing_uid:
q = projectB.query("UPDATE fingerprint SET uid = %s WHERE id = %s" % (uid_id, fingerprint_id))
print "Assigning %s to 0x%s." % (uid, fingerprint)
+ elif existing_uid == uid:
+ pass
+ elif existing_uid[:3] == "dm:":
+ q = projectB.query("UPDATE fingerprint SET uid = %s WHERE id = %s" % (uid_id, fingerprint_id))
+ print "Promoting DM %s to DD %s with keyid 0x%s." % (existing_uid, uid, fingerprint)
else:
- if existing_uid != uid:
- daklib.utils.fubar("%s has %s in LDAP, but projectB says it should be %s." % (uid, fingerprint, existing_uid))
+ daklib.utils.warn("%s has %s in LDAP, but projectB says it should be %s." % (uid, fingerprint, existing_uid))
# Try to update people who sign with non-primary key
q = projectB.query("SELECT fingerprint, id FROM fingerprint WHERE uid is null")
for i in q.getresult():
(fingerprint, fingerprint_id) = i
- cmd = "gpg --no-default-keyring --keyring=%s --keyring=%s --fingerprint %s" \
- % (Cnf["Dinstall::PGPKeyring"], Cnf["Dinstall::GPGKeyring"],
- fingerprint)
+ cmd = "gpg --no-default-keyring %s --fingerprint %s" \
+ % (daklib.utils.gpg_keyring_args(), fingerprint)
(result, output) = commands.getstatusoutput(cmd)
if result == 0:
m = re_gpg_fingerprint.search(output)
primary_key = m.group(1)
primary_key = primary_key.replace(" ","")
if not ldap_fin_uid_id.has_key(primary_key):
- daklib.utils.fubar("0x%s (from 0x%s): no UID found in LDAP" % (primary_key, fingerprint))
- (uid, uid_id) = ldap_fin_uid_id[primary_key]
- q = projectB.query("UPDATE fingerprint SET uid = %s WHERE id = %s" % (uid_id, fingerprint_id))
- print "Assigning %s to 0x%s." % (uid, fingerprint)
+ daklib.utils.warn("0x%s (from 0x%s): no UID found in LDAP" % (primary_key, fingerprint))
+ else:
+ (uid, uid_id) = ldap_fin_uid_id[primary_key]
+ q = projectB.query("UPDATE fingerprint SET uid = %s WHERE id = %s" % (uid_id, fingerprint_id))
+ print "Assigning %s to 0x%s." % (uid, fingerprint)
else:
extra_keyrings = ""
for keyring in Cnf.ValueList("Import-LDAP-Fingerprints::ExtraKeyrings"):
extra_keyrings += " --keyring=%s" % (keyring)
- cmd = "gpg --keyring=%s --keyring=%s %s --list-key %s" \
- % (Cnf["Dinstall::PGPKeyring"], Cnf["Dinstall::GPGKeyring"],
- extra_keyrings, fingerprint)
+ cmd = "gpg %s %s --list-key %s" \
+ % (daklib.utils.gpg_keyring_args(), extra_keyrings, fingerprint)
(result, output) = commands.getstatusoutput(cmd)
if result != 0:
cmd = "gpg --keyserver=%s --allow-non-selfsigned-uid --recv-key %s" % (Cnf["Import-LDAP-Fingerprints::KeyServer"], fingerprint)
guess_uid = "???"
name = " ".join(output.split('\n')[0].split()[3:])
print "0x%s -> %s -> %s" % (fingerprint, name, guess_uid)
+
# FIXME: make me optionally non-interactive
# FIXME: default to the guessed ID
uid = None
uid = None
else:
entry = Attrs[0][1]
- name = " ".join([get_ldap_value(entry, "cn"),
- get_ldap_value(entry, "mn"),
- get_ldap_value(entry, "sn")])
+ name = get_ldap_name(entry)
prompt = "Map to %s - %s (y/N) ? " % (uid, name.replace(" "," "))
yn = daklib.utils.our_raw_input(prompt).lower()
if yn == "y":
prefix = ""
else:
prefix = ""
- component = component.replace("non-US/", "")
if component != "main":
suffix = '/' + component
else:
else:
packages[package] = { "maintainer": maintainer, "priority": suite_priority, "version": version }
- # Process any additional Maintainer files (e.g. from non-US or pseudo packages)
+ # Process any additional Maintainer files (e.g. from pseudo packages)
for filename in extra_files:
file = daklib.utils.open_file(filename)
for line in file.readlines():
lhs = split[0]
maintainer = fix_maintainer(" ".join(split[1:]))
if lhs.find('~') != -1:
- (package, version) = lhs.split('~')
+ (package, version) = lhs.split('~', 1)
else:
package = lhs
version = '*'
if otype == "deb":
suffix = ""
elif otype == "udeb":
- if component != "main":
+ if component == "contrib":
continue; # Ick2
suffix = ".debian-installer"
elif otype == "dsc":
suffix = ".src"
- filename = "%s/override.%s.%s%s" % (Cnf["Dir::Override"], override_suite, component.replace("non-US/", ""), suffix)
+ filename = "%s/override.%s.%s%s" % (Cnf["Dir::Override"], override_suite, component, suffix)
output_file = daklib.utils.open_file(filename, 'w')
do_list(output_file, suite, component, otype)
output_file.close()
-a, --architecture=ARCH only write file lists for this architecture
-c, --component=COMPONENT only write file lists for this component
+ -f, --force ignore Untouchable suite directives in dak.conf
-h, --help show this help and exit
-n, --no-delete don't delete older versions
-s, --suite=SUITE only write file lists for this suite
delete_version = version[0]
delete_id = packages[delete_unique_id]["id"]
delete_arch = packages[delete_unique_id]["arch"]
- if not Cnf.Find("Suite::%s::Untouchable" % (suite)):
+ if not Cnf.Find("Suite::%s::Untouchable" % (suite)) or Options["Force"]:
if Options["No-Delete"]:
print "Would delete %s_%s_%s in %s in favour of %s_%s" % (pkg, delete_arch, delete_version, suite, dominant_version, dominant_arch)
else:
('c', "component", "Make-Suite-File-List::Options::Component", "HasArg"),
('h', "help", "Make-Suite-File-List::Options::Help"),
('n', "no-delete", "Make-Suite-File-List::Options::No-Delete"),
+ ('f', "force", "Make-Suite-File-List::Options::Force"),
('s', "suite", "Make-Suite-File-List::Options::Suite", "HasArg")]
- for i in ["architecture", "component", "help", "no-delete", "suite" ]:
+ for i in ["architecture", "component", "help", "no-delete", "suite", "force" ]:
if not Cnf.has_key("Make-Suite-File-List::Options::%s" % (i)):
Cnf["Make-Suite-File-List::Options::%s" % (i)] = ""
apt_pkg.ParseCommandLine(Cnf,Arguments,sys.argv)
def usage (exit_code=0):
print """Usage: dak override [OPTIONS] package [section] [priority]
-Make microchanges or microqueries of the overrides
+Make microchanges or microqueries of the binary overrides
-h, --help show this help and exit
-d, --done=BUG# send priority/section change as closure to bug#
else:
daklib.utils.fubar("%s is not a valid section or priority" % (arg))
-
# Retrieve current section/priority...
- q = projectB.query("""
- SELECT priority.priority AS prio, section.section AS sect
- FROM override, priority, section, suite
+ oldsection, oldsourcesection, oldpriority = None, None, None
+ for type in ['source', 'binary']:
+ eqdsc = '!='
+ if type == 'source':
+ eqdsc = '='
+ q = projectB.query("""
+ SELECT priority.priority AS prio, section.section AS sect, override_type.type AS type
+ FROM override, priority, section, suite, override_type
WHERE override.priority = priority.id
+ AND override.type = override_type.id
+ AND override_type.type %s 'dsc'
AND override.section = section.id
AND override.package = %s
AND override.suite = suite.id
AND suite.suite_name = %s
- """ % (pg._quote(package,"str"), pg._quote(suite,"str")))
+ """ % (eqdsc, pg._quote(package,"str"), pg._quote(suite,"str")))
- if q.ntuples() == 0:
- daklib.utils.fubar("Unable to find package %s" % (package))
- if q.ntuples() > 1:
- daklib.utils.fubar("%s is ambiguous. Matches %d packages" % (package,q.ntuples()))
+ if q.ntuples() == 0:
+ continue
+ if q.ntuples() > 1:
+ daklib.utils.fubar("%s is ambiguous. Matches %d packages" % (package,q.ntuples()))
+
+ r = q.getresult()
+ if type == 'binary':
+ oldsection = r[0][1]
+ oldpriority = r[0][0]
+ else:
+ oldsourcesection = r[0][1]
- r = q.getresult()
- oldsection = r[0][1]
- oldpriority = r[0][0]
+ if not oldpriority and not oldsourcesection:
+ daklib.utils.fubar("Unable to find package %s" % (package))
+ if oldsection and oldsourcesection and oldsection != oldsourcesection:
+ # When setting overrides, both source & binary will become the same section
+ daklib.utils.warn("Source is in section '%s' instead of '%s'" % (oldsourcesection, oldsection))
+ if not oldsection:
+ oldsection = oldsourcesection
if not arguments:
- print "%s is in section '%s' at priority '%s'" % (
- package,oldsection,oldpriority)
+ if oldpriority:
+ print "%s is in section '%s' at priority '%s'" % (
+ package,oldsection,oldpriority)
+ elif oldsourcesection:
+ # no use printing this line if also binary
+ print "%s is in section '%s'" % (
+ package,oldsourcesection)
sys.exit(0)
# At this point, we have a new section and priority... check they're valid...
print "I: Doing nothing"
sys.exit(0)
+ if newpriority and not oldpriority:
+ daklib.utils.fubar("Trying to set priority of a source-only package")
+
# If we're in no-action mode
if Options["No-Action"]:
if newpriority != oldpriority:
UPDATE override
SET priority=%d
WHERE package=%s
+ AND override.type != %d
AND suite = (SELECT id FROM suite WHERE suite_name=%s)""" % (
newprioid,
- pg._quote(package,"str"),
+ pg._quote(package,"str"), daklib.database.get_override_type_id("dsc"),
pg._quote(suite,"str") ))
Logger.log(["changed priority",package,oldpriority,newpriority])
newsecid,
pg._quote(package,"str"),
pg._quote(suite,"str") ))
- Logger.log(["changed priority",package,oldsection,newsection])
+ Logger.log(["changed section",package,oldsection,newsection])
projectB.query("COMMIT WORK")
if Options.has_key("Done"):
files_id = daklib.database.set_files_id (filename, dsc_files[dsc_file]["size"], dsc_files[dsc_file]["md5sum"], dsc_location_id)
projectB.query("INSERT INTO dsc_files (source, file) VALUES (currval('source_id_seq'), %d)" % (files_id))
+ # Add the src_uploaders to the DB
+ if dsc.get("dm-upload-allowed", "no") == "yes":
+ uploader_ids = [maintainer_id]
+ if dsc.has_key("uploaders"):
+ for u in dsc["uploaders"].split(","):
+ u = u.replace("'", "\\'")
+ u = u.strip()
+ uploader_ids.append(
+ daklib.database.get_or_set_maintainer_id(u))
+ for u in uploader_ids:
+ projectB.query("INSERT INTO src_uploaders (source, maintainer) VALUES (currval('source_id_seq'), %d)" % (u))
+
+
# Add the .deb files to the DB
for file in files.keys():
if files[file]["type"] == "deb":
# Add the binaries to stable (and remove it/them from proposed-updates)
for file in files.keys():
if files[file]["type"] == "deb":
- binNMU = 0
package = files[file]["package"]
version = files[file]["version"]
architecture = files[file]["architecture"]
q = projectB.query("SELECT b.id FROM binaries b, architecture a WHERE b.package = '%s' AND b.version = '%s' AND (a.arch_string = '%s' OR a.arch_string = 'all') AND b.architecture = a.id" % (package, version, architecture))
ql = q.getresult()
if not ql:
- suite_id = daklib.database.get_suite_id('proposed-updates')
- que = "SELECT b.version FROM binaries b JOIN bin_associations ba ON (b.id = ba.bin) JOIN suite su ON (ba.suite = su.id) WHERE b.package = '%s' AND (ba.suite = '%s')" % (package, suite_id)
- q = projectB.query(que)
-
- # Reduce the query results to a list of version numbers
- ql = [ i[0] for i in q.getresult() ]
- if not ql:
- daklib.utils.fubar("[INTERNAL ERROR] couldn't find '%s' (%s for %s architecture) in binaries table." % (package, version, architecture))
- else:
- for x in ql:
- if re.match(re.compile(r"%s((\.0)?\.)|(\+b)\d+$" % re.escape(version)),x):
- binNMU = 1
- break
- if not binNMU:
- binary_id = ql[0][0]
- suite_id = daklib.database.get_suite_id('proposed-updates')
- projectB.query("DELETE FROM bin_associations WHERE suite = '%s' AND bin = '%s'" % (suite_id, binary_id))
- suite_id = daklib.database.get_suite_id('stable')
- projectB.query("INSERT INTO bin_associations (suite, bin) VALUES ('%s', '%s')" % (suite_id, binary_id))
- else:
- del files[file]
+ daklib.utils.fubar("[INTERNAL ERROR] couldn't find '%s' (%s for %s architecture) in binaries table." % (package, version, architecture))
+
+ binary_id = ql[0][0]
+ suite_id = daklib.database.get_suite_id('proposed-updates')
+ projectB.query("DELETE FROM bin_associations WHERE suite = '%s' AND bin = '%s'" % (suite_id, binary_id))
+ suite_id = daklib.database.get_suite_id('stable')
+ projectB.query("INSERT INTO bin_associations (suite, bin) VALUES ('%s', '%s')" % (suite_id, binary_id))
projectB.query("COMMIT WORK")
if not Options["No-Mail"] and changes["architecture"].has_key("source"):
Subst["__SUITE__"] = " into stable"
Subst["__SUMMARY__"] = summary
- mail_message = daklib.utils.TemplateSubst(Subst,Cnf["Dir::Templates"]+"/process-accepted.installed")
+ mail_message = daklib.utils.TemplateSubst(Subst,Cnf["Dir::Templates"]+"/process-accepted.install")
daklib.utils.send_mail(mail_message)
Upload.announce(short_summary, 1)
################################################################################
-re_valid_version = re.compile(r"^([0-9]+:)?[0-9A-Za-z\.\-\+:]+$")
+re_valid_version = re.compile(r"^([0-9]+:)?[0-9A-Za-z\.\-\+:~]+$")
re_valid_pkg_name = re.compile(r"^[\dA-Za-z][\dA-Za-z\+\-\.]+$")
re_changelog_versions = re.compile(r"^\w[-+0-9a-z.]+ \([^\(\) \t]+\)")
re_strip_revision = re.compile(r"-([^-]+)$")
+re_strip_srcver = re.compile(r"\s+\(\S+\)$")
################################################################################
reject("%s: Missing mandatory field `%s'." % (filename, i))
return 0 # Avoid <undef> errors during later tests
+ # Strip a source version in brackets from the source field
+ if re_strip_srcver.search(changes["source"]):
+ changes["source"] = re_strip_srcver.sub('', changes["source"])
+
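For context: binary-only (e.g. binNMU) .changes files carry the source version in parentheses after the source name, which is what the new regexp strips. A tiny standalone check, with an invented package name:

    import re

    re_strip_srcver = re.compile(r"\s+\(\S+\)$")
    assert re_strip_srcver.sub('', "gcc-defaults (1.30)") == "gcc-defaults"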
+ # Ensure the source field is a valid package name.
+ if not re_valid_pkg_name.match(changes["source"]):
+ reject("%s: invalid source name '%s'." % (filename, changes["source"]))
+
# Split multi-value fields into a lower-level dictionary
for i in ("architecture", "distribution", "binary", "closes"):
o = changes.get(i, "")
# Check there isn't already a changes file of the same name in one
# of the queue directories.
base_filename = os.path.basename(filename)
- for dir in [ "Accepted", "Byhand", "Done", "New", "ProposedUpdates" ]:
+ for dir in [ "Accepted", "Byhand", "Done", "New", "ProposedUpdates", "OldProposedUpdates" ]:
if os.path.exists(Cnf["Dir::Queue::%s" % (dir) ]+'/'+base_filename):
reject("%s: a file with this name already exists in the %s directory." % (base_filename, dir))
o control.tar.gz
o data.tar.gz or data.tar.bz2
-in that order, and nothing else. If the third member is a
-data.tar.bz2, an additional check is performed for the required
-Pre-Depends on dpkg (>= 1.10.24)."""
+in that order, and nothing else."""
cmd = "ar t %s" % (filename)
(result, output) = commands.getstatusoutput(cmd)
if result != 0:
reject("%s: first chunk is '%s', expected 'debian-binary'." % (filename, chunks[0]))
if chunks[1] != "control.tar.gz":
reject("%s: second chunk is '%s', expected 'control.tar.gz'." % (filename, chunks[1]))
- if chunks[2] == "data.tar.bz2":
- # Packages using bzip2 compression must have a Pre-Depends on dpkg >= 1.10.24.
- found_needed_predep = 0
- for parsed_dep in apt_pkg.ParseDepends(control.Find("Pre-Depends", "")):
- for atom in parsed_dep:
- (dep, version, constraint) = atom
- if dep != "dpkg" or (constraint != ">=" and constraint != ">>") or \
- len(parsed_dep) > 1: # or'ed deps don't count
- continue
- if (constraint == ">=" and apt_pkg.VersionCompare(version, "1.10.24") < 0) or \
- (constraint == ">>" and apt_pkg.VersionCompare(version, "1.10.23") < 0):
- continue
- found_needed_predep = 1
- if not found_needed_predep:
- reject("%s: uses bzip2 compression, but doesn't Pre-Depend on dpkg (>= 1.10.24)" % (filename))
- elif chunks[2] != "data.tar.gz":
+ if chunks[2] not in [ "data.tar.bz2", "data.tar.gz" ]:
reject("%s: third chunk is '%s', expected 'data.tar.gz' or 'data.tar.bz2'." % (filename, chunks[2]))
################################################################################
for file in file_keys:
# Ensure the file does not already exist in one of the accepted directories
- for dir in [ "Accepted", "Byhand", "New", "ProposedUpdates" ]:
+ for dir in [ "Accepted", "Byhand", "New", "ProposedUpdates", "OldProposedUpdates", "Embargoed", "Unembargoed" ]:
+ if not Cnf.has_key("Dir::Queue::%s" % (dir)): continue
if os.path.exists(Cnf["Dir::Queue::%s" % (dir) ]+'/'+file):
reject("%s file already exists in the %s directory." % (file, dir))
if not daklib.utils.re_taint_free.match(file):
files[file]["type"] = "unreadable"
continue
# If it's byhand skip remaining checks
- if files[file]["section"] == "byhand" or files[file]["section"] == "raw-installer":
+ if files[file]["section"] == "byhand" or files[file]["section"][:4] == "raw-":
files[file]["byhand"] = 1
files[file]["type"] = "byhand"
# Checks for a binary package...
files[file]["new"] = 1
else:
dsc_file_exists = 0
- for myq in ["Accepted", "Embargoed", "Unembargoed"]:
- if os.path.exists(Cnf["Dir::Queue::"+myq] + '/' + dsc_filename):
- dsc_file_exists = 1
- break
+ for myq in ["Accepted", "Embargoed", "Unembargoed", "ProposedUpdates", "OldProposedUpdates"]:
+ if Cnf.has_key("Dir::Queue::%s" % (myq)):
+ if os.path.exists(Cnf["Dir::Queue::"+myq] + '/' + dsc_filename):
+ dsc_file_exists = 1
+ break
if not dsc_file_exists:
reject("no source found for %s %s (%s)." % (source_package, source_version, file))
# Check the version and for file overwrites
m = daklib.utils.re_issource.match(f)
if not m:
reject("%s: %s in Files field not recognised as source." % (dsc_filename, f))
+ continue
type = m.group(3)
if type == "orig.tar.gz" or type == "tar.gz":
has_tar = 1
apt_inst.debExtract(deb_file,tar.callback,"data.tar.gz")
except SystemError, e:
# If we can't find a data.tar.gz, look for data.tar.bz2 instead.
- if not re.match(r"Cannot f[ui]nd chunk data.tar.gz$", str(e)):
+ if not re.search(r"Cannot f[ui]nd chunk data.tar.gz$", str(e)):
raise
deb_file.seek(0)
apt_inst.debExtract(deb_file,tar.callback,"data.tar.bz2")
except:
reject("%s: deb contents timestamp check failed [%s: %s]" % (filename, sys.exc_type, sys.exc_value))
+################################################################################
+
+def lookup_uid_from_fingerprint(fpr):
+ q = Upload.projectB.query("SELECT u.uid, u.name FROM fingerprint f, uid u WHERE f.uid = u.id AND f.fingerprint = '%s'" % (fpr))
+ qs = q.getresult()
+ if len(qs) == 0:
+ return (None, None)
+ else:
+ return qs[0]
+
+def check_signed_by_key():
+ """Ensure the .changes is signed by an authorized uploader."""
+
+ (uid, uid_name) = lookup_uid_from_fingerprint(changes["fingerprint"])
+ if uid_name == None:
+ uid_name = ""
+
+ # match claimed name with actual name:
+ if uid == None:
+ uid, uid_email = changes["fingerprint"], uid
+ may_nmu, may_sponsor = 1, 1
+ # XXX by default new dds don't have a fingerprint/uid in the db atm,
+ # and can't get one in there if we don't allow nmu/sponsorship
+ elif uid[:3] == "dm:":
+ uid_email = uid[3:]
+ may_nmu, may_sponsor = 0, 0
+ else:
+ uid_email = "%s@debian.org" % (uid)
+ may_nmu, may_sponsor = 1, 1
+
+ if uid_email in [changes["maintaineremail"], changes["changedbyemail"]]:
+ sponsored = 0
+ elif uid_name in [changes["maintainername"], changes["changedbyname"]]:
+ sponsored = 0
+ if uid_name == "": sponsored = 1
+ else:
+ sponsored = 1
+
+ if sponsored and not may_sponsor:
+ reject("%s is not authorised to sponsor uploads" % (uid))
+
+ if not sponsored and not may_nmu:
+ source_ids = []
+ check_suites = changes["distribution"].keys()
+ if "unstable" not in check_suites: check_suites.append("unstable")
+ for suite in check_suites:
+ suite_id = daklib.database.get_suite_id(suite)
+ q = Upload.projectB.query("SELECT s.id FROM source s JOIN src_associations sa ON (s.id = sa.source) WHERE s.source = '%s' AND sa.suite = %d" % (changes["source"], suite_id))
+ for si in q.getresult():
+ if si[0] not in source_ids: source_ids.append(si[0])
+
+ print "source_ids: %s" % (",".join([str(x) for x in source_ids]))
+
+ is_nmu = 1
+ for si in source_ids:
+ is_nmu = 1
+ q = Upload.projectB.query("SELECT m.name FROM maintainer m WHERE m.id IN (SELECT maintainer FROM src_uploaders WHERE src_uploaders.source = %s)" % (si))
+ for m in q.getresult():
+ (rfc822, rfc2047, name, email) = daklib.utils.fix_maintainer(m[0])
+ if email == uid_email or name == uid_name:
+ is_nmu=0
+ break
+ if is_nmu:
+ reject("%s may not upload/NMU source package %s" % (uid, changes["source"]))
+
+ for b in changes["binary"].keys():
+ for suite in changes["distribution"].keys():
+ suite_id = daklib.database.get_suite_id(suite)
+ q = Upload.projectB.query("SELECT DISTINCT s.source FROM source s JOIN binaries b ON (s.id = b.source) JOIN bin_associations ba On (b.id = ba.bin) WHERE b.package = '%s' AND ba.suite = %s" % (b, suite_id))
+ for s in q.getresult():
+ if s[0] != changes["source"]:
+ reject("%s may not hijack %s from source package %s in suite %s" % (uid, b, s[0], suite))
+
+ for file in files.keys():
+ if files[file].has_key("byhand"):
+ reject("%s may not upload BYHAND file %s" % (uid, file))
+ if files[file].has_key("new"):
+ reject("%s may not upload NEW file %s" % (uid, file))
+
+ # The remaining checks only apply to binary-only uploads right now
+ if changes["architecture"].has_key("source"):
+ return
+
+ if not Cnf.Exists("Binary-Upload-Restrictions"):
+ return
+
+ restrictions = Cnf.SubTree("Binary-Upload-Restrictions")
+
+ # If the restrictions only apply to certain components make sure
+ # that the upload is actually targeted there.
+ if restrictions.Exists("Components"):
+ restricted_components = restrictions.SubTree("Components").ValueList()
+ is_restricted = False
+ for file in files:
+ if files[file]["component"] in restricted_components:
+ is_restricted = True
+ break
+ if not is_restricted:
+ return
+
+ # Assuming binary only upload restrictions are in place we then
+ # iterate over suite and architecture checking the key is in the
+ # allowed list. If no allowed list exists for a given suite or
+ # architecture it's assumed to be open to anyone.
+ for suite in changes["distribution"].keys():
+ if not restrictions.Exists(suite):
+ continue
+ for arch in changes["architecture"].keys():
+ if not restrictions.SubTree(suite).Exists(arch):
+ continue
+ allowed_keys = restrictions.SubTree("%s::%s" % (suite, arch)).ValueList()
+ if changes["fingerprint"] not in allowed_keys:
+ base_filename = os.path.basename(pkg.changes_file)
+ reject("%s: not signed by authorised uploader for %s/%s"
+ % (base_filename, suite, arch))
+
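The checks above only assume a configuration shape along the following lines; the stanza name and the Components/<suite>::<arch> lookup paths come from the code, while the suite, architecture and fingerprint values are placeholders written in ordinary dak.conf (apt.conf-style) syntax:

    Binary-Upload-Restrictions
    {
      Components
      {
        "main";
        "contrib";
      };
      unstable
      {
        alpha
        {
          "0123456789ABCDEF0123456789ABCDEF01234567";
        };
      };
    };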
################################################################################
################################################################################
# q-unapproved hax0ring
queue_info = {
"New": { "is": is_new, "process": acknowledge_new },
+ "Autobyhand" : { "is" : is_autobyhand, "process": do_autobyhand },
"Byhand" : { "is": is_byhand, "process": do_byhand },
+ "OldStableUpdate" : { "is": is_oldstableupdate,
+ "process": do_oldstableupdate },
"StableUpdate" : { "is": is_stableupdate, "process": do_stableupdate },
"Unembargo" : { "is": is_unembargo, "process": queue_unembargo },
"Embargo" : { "is": is_embargo, "process": queue_embargo },
}
- queues = [ "New", "Byhand" ]
+ queues = [ "New", "Autobyhand", "Byhand" ]
if Cnf.FindB("Dinstall::SecurityQueueHandling"):
queues += [ "Unembargo", "Embargo" ]
else:
- queues += [ "StableUpdate" ]
+ queues += [ "OldStableUpdate", "StableUpdate" ]
(prompt, answer) = ("", "XXX")
if Options["No-Action"] or Options["Automatic"]:
accept(summary, short_summary)
remove_from_unchecked()
elif answer == queuekey:
- queue_info[queue]["process"](summary)
+ queue_info[queue]["process"](summary, short_summary)
remove_from_unchecked()
elif answer == 'Q':
sys.exit(0)
return 0
-def queue_unembargo (summary):
+def queue_unembargo (summary, short_summary):
print "Moving to UNEMBARGOED holding area."
Logger.log(["Moving to unembargoed", pkg.changes_file])
################################################################################
def is_embargo ():
- return 0
+ # if embargoed queues are enabled always embargo
+ return 1
-def queue_embargo (summary):
+def queue_embargo (summary, short_summary):
print "Moving to EMBARGOED holding area."
Logger.log(["Moving to embargoed", pkg.changes_file])
################################################################################
def is_stableupdate ():
- if changes["distribution"].has_key("proposed-updates"):
- return 1
- return 0
+ if not changes["distribution"].has_key("proposed-updates"):
+ return 0
-def do_stableupdate (summary):
+ if not changes["architecture"].has_key("source"):
+ pusuite = daklib.database.get_suite_id("proposed-updates")
+ q = Upload.projectB.query(
+ "SELECT S.source FROM source s JOIN src_associations sa ON (s.id = sa.source) WHERE s.source = '%s' AND s.version = '%s' AND sa.suite = %d" %
+ (changes["source"], changes["version"], pusuite))
+ ql = q.getresult()
+ if ql:
+ # source is already in proposed-updates so no need to hold
+ return 0
+
+ return 1
+
+def do_stableupdate (summary, short_summary):
print "Moving to PROPOSED-UPDATES holding area."
Logger.log(["Moving to proposed-updates", pkg.changes_file]);
################################################################################
+def is_oldstableupdate ():
+ if not changes["distribution"].has_key("oldstable-proposed-updates"):
+ return 0
+
+ if not changes["architecture"].has_key("source"):
+ pusuite = daklib.database.get_suite_id("oldstable-proposed-updates")
+ q = Upload.projectB.query(
+ "SELECT S.source FROM source s JOIN src_associations sa ON (s.id = sa.source) WHERE s.source = '%s' AND s.version = '%s' AND sa.suite = %d" %
+ (changes["source"], changes["version"], pusuite))
+ ql = q.getresult()
+ if ql:
+ # source is already in oldstable-proposed-updates so no need to hold
+ return 0
+
+ return 1
+
+def do_oldstableupdate (summary, short_summary):
+ print "Moving to OLDSTABLE-PROPOSED-UPDATES holding area."
+ Logger.log(["Moving to oldstable-proposed-updates", pkg.changes_file]);
+
+ Upload.dump_vars(Cnf["Dir::Queue::OldProposedUpdates"]);
+ move_to_dir(Cnf["Dir::Queue::OldProposedUpdates"])
+
+ # Check for override disparities
+ Upload.Subst["__SUMMARY__"] = summary;
+ Upload.check_override();
+
+################################################################################
+
+def is_autobyhand ():
+ all_auto = 1
+ any_auto = 0
+ for file in files.keys():
+ if files[file].has_key("byhand"):
+ any_auto = 1
+
+ # filename is of the form "PKG_VER_ARCH.EXT", where PKG, VER and ARCH
+ # don't contain underscores, and ARCH doesn't contain dots.
+ # Furthermore, VER must match the .changes Version: and ARCH must be
+ # listed in the .changes Architecture: field (see the worked example
+ # after this function).
+ if file.count("_") < 2:
+ all_auto = 0
+ continue
+
+ (pkg, ver, archext) = file.split("_", 2)
+ if archext.count(".") < 1 or changes["version"] != ver:
+ all_auto = 0
+ continue
+
+ ABH = Cnf.SubTree("AutomaticByHandPackages")
+ if not ABH.has_key(pkg) or \
+ ABH["%s::Source" % (pkg)] != changes["source"]:
+ print "no AutomaticByHandPackages match for %s (source %s)" % (pkg, changes["source"])
+ all_auto = 0
+ continue
+
+ (arch, ext) = archext.split(".", 1)
+ if arch not in changes["architecture"]:
+ all_auto = 0
+ continue
+
+ files[file]["byhand-arch"] = arch
+ files[file]["byhand-script"] = ABH["%s::Script" % (pkg)]
+
+ return any_auto and all_auto
+
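A worked example of the filename convention checked above (the package name and version are invented):

    file = "debian-installer-images_20080101_amd64.tar.gz"
    (pkg, ver, archext) = file.split("_", 2)
    (arch, ext) = archext.split(".", 1)
    assert (pkg, ver, arch, ext) == \
        ("debian-installer-images", "20080101", "amd64", "tar.gz")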
+def do_autobyhand (summary, short_summary):
+ print "Attempting AUTOBYHAND."
+ byhandleft = 0
+ for file in files.keys():
+ byhandfile = file
+ if not files[file].has_key("byhand"):
+ continue
+ if not files[file].has_key("byhand-script"):
+ byhandleft = 1
+ continue
+
+ os.system("ls -l %s" % byhandfile)
+ result = os.system("%s %s %s %s %s" % (
+ files[file]["byhand-script"], byhandfile,
+ changes["version"], files[file]["byhand-arch"],
+ os.path.abspath(pkg.changes_file)))
+ if result == 0:
+ os.unlink(byhandfile)
+ del files[file]
+ else:
+ print "Error processing %s, left as byhand." % (file)
+ byhandleft = 1
+
+ if byhandleft:
+ do_byhand(summary, short_summary)
+ else:
+ accept(summary, short_summary)
+
+################################################################################
+
def is_byhand ():
for file in files.keys():
if files[file].has_key("byhand"):
return 1
return 0
-def do_byhand (summary):
+def do_byhand (summary, short_summary):
print "Moving to BYHAND holding area."
Logger.log(["Moving to byhand", pkg.changes_file])
return 1
return 0
-def acknowledge_new (summary):
+def acknowledge_new (summary, short_summary):
Subst = Upload.Subst
print "Moving to NEW holding area."
check_md5sums()
check_urgency()
check_timestamps()
+ check_signed_by_key()
Upload.update_subst(reject_message)
action()
except SystemExit:
def header():
print """<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
- <html><head><meta http-equiv="Content-Type" content="text/html; charset=iso8859-1">
+ <html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Debian NEW and BYHAND Packages</title>
<link type="text/css" rel="stylesheet" href="style.css">
<link rel="shortcut icon" href="http://www.debian.org/favicon.ico">
print "<p class=\"validate\">Package count in <b>%s</b>: <i>%s</i>\n" % (type, source_count)
print "<br>Total Package count: <i>%s</i></p>\n" % (total_count)
-def force_to_latin(s):
- """Forces a string to Latin-1."""
- latin1_s = unicode(s,'utf-8')
- return latin1_s.encode('iso8859-1', 'replace')
-
def table_row(source, version, arch, last_mod, maint, distribution, closes):
print "<td valign=\"top\" class=\"%s\">%s</td>" % (tdclass, source)
print "<td valign=\"top\" class=\"%s\">" % (tdclass)
for vers in version.split():
- print "%s<br>" % (vers)
+ print "<a href=\"/new/%s_%s.html\">%s</a><br>" % (source, vers, vers)
print "</td><td valign=\"top\" class=\"%s\">%s</td><td valign=\"top\" class=\"%s\">" % (tdclass, arch, tdclass)
for dist in distribution:
print "%s<br>" % (dist)
print "</td><td valign=\"top\" class=\"%s\">%s</td>" % (tdclass, last_mod)
(name, mail) = maint.split(":")
- name = force_to_latin(name)
print "<td valign=\"top\" class=\"%s\"><a href=\"http://qa.debian.org/developer.php?login=%s\">%s</a></td>" % (tdclass, mail, name)
print "<td valign=\"top\" class=\"%s\">" % (tdclass)
stats = {}
file = daklib.utils.open_file("2001-11")
for line in file.readlines():
- split = line.strip().split('~')
+ split = line.strip().split('|')
program = split[1]
if program != "katie" and program != "process-accepted":
continue
# FIXME: ugly hacks to work around override brain damage
section = re_strip_section_prefix.sub('', section)
- section = section.lower().replace('non-us', '')
if section == "main" or section == "contrib" or section == "non-free":
section = ''
if section != '':
component_id_cache = {}
location_id_cache = {}
maintainer_id_cache = {}
+keyring_id_cache = {}
source_id_cache = {}
files_id_cache = {}
maintainer_cache = {}
def get_location_id (location, component, archive):
global location_id_cache
- cache_key = location + '~' + component + '~' + location
+ cache_key = location + '_' + component + '_' + location
if location_id_cache.has_key(cache_key):
return location_id_cache[cache_key]
def get_source_id (source, version):
global source_id_cache
- cache_key = source + '~' + version + '~'
+ cache_key = source + '_' + version + '_'
if source_id_cache.has_key(cache_key):
return source_id_cache[cache_key]
################################################################################
+def get_or_set_keyring_id (keyring):
+ global keyring_id_cache
+
+ if keyring_id_cache.has_key(keyring):
+ return keyring_id_cache[keyring]
+
+ q = projectB.query("SELECT id FROM keyrings WHERE name = '%s'" % (keyring))
+ if not q.getresult():
+ projectB.query("INSERT INTO keyrings (name) VALUES ('%s')" % (keyring))
+ q = projectB.query("SELECT id FROM keyrings WHERE name = '%s'" % (keyring))
+ keyring_id = q.getresult()[0][0]
+ keyring_id_cache[keyring] = keyring_id
+
+ return keyring_id
+
+################################################################################
+
def get_or_set_uid_id (uid):
global uid_id_cache
def get_files_id (filename, size, md5sum, location_id):
global files_id_cache
- cache_key = "%s~%d" % (filename, location_id)
+ cache_key = "%s_%d" % (filename, location_id)
if files_id_cache.has_key(cache_key):
return files_id_cache[cache_key]
##
##q = projectB.query("SELECT id FROM files WHERE id = currval('files_id_seq')")
##ql = q.getresult()[0]
- ##cache_key = "%s~%d" % (filename, location_id)
+ ##cache_key = "%s_%d" % (filename, location_id)
##files_id_cache[cache_key] = ql[0]
##return files_id_cache[cache_key]
if not os.path.exists(logdir):
umask = os.umask(00000)
os.makedirs(logdir, 02775)
+ os.umask(umask)
# Open the logfile
logfilename = "%s/%s" % (logdir, time.strftime("%Y-%m"))
logfile = None
if debug:
logfile = sys.stderr
else:
+ umask = os.umask(00002)
logfile = utils.open_file(logfilename, 'a')
+ os.umask(umask)
self.logfile = logfile
# Log the start of the program
user = pwd.getpwuid(os.getuid())[0]
re_fdnic = re.compile(r"\n\n")
re_bin_only_nmu = re.compile(r"\+b\d+$")
+################################################################################
+
+# Determine what parts in a .changes are NEW
+
+def determine_new(changes, files, projectB, warn=1):
+ new = {}
+
+ # Build up a list of potentially new things
+ for file in files.keys():
+ f = files[file]
+ # Skip byhand elements
+ if f["type"] == "byhand":
+ continue
+ pkg = f["package"]
+ priority = f["priority"]
+ section = f["section"]
+ type = get_type(f)
+ component = f["component"]
+
+ if type == "dsc":
+ priority = "source"
+ if not new.has_key(pkg):
+ new[pkg] = {}
+ new[pkg]["priority"] = priority
+ new[pkg]["section"] = section
+ new[pkg]["type"] = type
+ new[pkg]["component"] = component
+ new[pkg]["files"] = []
+ else:
+ old_type = new[pkg]["type"]
+ if old_type != type:
+ # source gets trumped by deb or udeb
+ if old_type == "dsc":
+ new[pkg]["priority"] = priority
+ new[pkg]["section"] = section
+ new[pkg]["type"] = type
+ new[pkg]["component"] = component
+ new[pkg]["files"].append(file)
+ if f.has_key("othercomponents"):
+ new[pkg]["othercomponents"] = f["othercomponents"]
+
+ for suite in changes["suite"].keys():
+ suite_id = database.get_suite_id(suite)
+ for pkg in new.keys():
+ component_id = database.get_component_id(new[pkg]["component"])
+ type_id = database.get_override_type_id(new[pkg]["type"])
+ q = projectB.query("SELECT package FROM override WHERE package = '%s' AND suite = %s AND component = %s AND type = %s" % (pkg, suite_id, component_id, type_id))
+ ql = q.getresult()
+ if ql:
+ for file in new[pkg]["files"]:
+ if files[file].has_key("new"):
+ del files[file]["new"]
+ del new[pkg]
+
+ if warn:
+ if changes["suite"].has_key("stable"):
+ print "WARNING: overrides will be added for stable!"
+ if changes["suite"].has_key("oldstable"):
+ print "WARNING: overrides will be added for OLDstable!"
+ for pkg in new.keys():
+ if new[pkg].has_key("othercomponents"):
+ print "WARNING: %s already present in %s distribution." % (pkg, new[pkg]["othercomponents"])
+
+ return new
+
+################################################################################
+
+def get_type(f):
+ # Determine the type
+ if f.has_key("dbtype"):
+ type = f["dbtype"]
+ elif f["type"] in [ "orig.tar.gz", "orig.tar.bz2", "tar.gz", "tar.bz2", "diff.gz", "diff.bz2", "dsc" ]:
+ type = "dsc"
+ else:
+ fubar("invalid type (%s) for new. Dazed, confused and sure as heck not continuing." % (f["type"]))
+
+ # Validate the override type
+ type_id = database.get_override_type_id(type)
+ if type_id == -1:
+ fubar("invalid type (%s) for new. Say wha?" % (type))
+
+ return type
+
+################################################################################
+
+# check if section/priority values are valid
+
+def check_valid(new):
+ for pkg in new.keys():
+ section = new[pkg]["section"]
+ priority = new[pkg]["priority"]
+ type = new[pkg]["type"]
+ new[pkg]["section id"] = database.get_section_id(section)
+ new[pkg]["priority id"] = database.get_priority_id(new[pkg]["priority"])
+ # Sanity checks
+ di = section.find("debian-installer") != -1
+ if (di and type != "udeb") or (not di and type == "udeb"):
+ new[pkg]["section id"] = -1
+ if (priority == "source" and type != "dsc") or \
+ (priority != "source" and type == "dsc"):
+ new[pkg]["priority id"] = -1
+
+
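A hedged sketch of how a caller such as process-new or the new show-new command might drive these helpers; 'u' stands for a daklib.queue.Upload instance with a parsed .changes loaded, and the variable names are illustrative:

    import daklib.queue

    new = daklib.queue.determine_new(u.pkg.changes, u.pkg.files, u.projectB, warn=0)
    daklib.queue.check_valid(new)
    # entries whose section/priority could not be validated end up with id -1
    needs_fixup = [p for p in new.keys()
                   if new[p]["section id"] == -1 or new[p]["priority id"] == -1]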
###############################################################################
# Convenience wrapper to carry around all the package information in
###############################################################################
-class nmu_p:
- # Read in the group maintainer override file
- def __init__ (self, Cnf):
- self.group_maint = {}
- self.Cnf = Cnf
- if Cnf.get("Dinstall::GroupOverrideFilename"):
- filename = Cnf["Dir::Override"] + Cnf["Dinstall::GroupOverrideFilename"]
- file = utils.open_file(filename)
- for line in file.readlines():
- line = utils.re_comments.sub('', line).lower().strip()
- if line != "":
- self.group_maint[line] = 1
- file.close()
-
- def is_an_nmu (self, pkg):
- Cnf = self.Cnf
- changes = pkg.changes
- dsc = pkg.dsc
-
- i = utils.fix_maintainer (dsc.get("maintainer",
- Cnf["Dinstall::MyEmailAddress"]).lower())
- (dsc_rfc822, dsc_rfc2047, dsc_name, dsc_email) = i
- # changes["changedbyname"] == dsc_name is probably never true, but better safe than sorry
- if dsc_name == changes["maintainername"].lower() and \
- (changes["changedby822"] == "" or changes["changedbyname"].lower() == dsc_name):
- return 0
-
- if dsc.has_key("uploaders"):
- uploaders = dsc["uploaders"].lower().split(",")
- uploadernames = {}
- for i in uploaders:
- (rfc822, rfc2047, name, email) = utils.fix_maintainer (i.strip())
- uploadernames[name] = ""
- if uploadernames.has_key(changes["changedbyname"].lower()):
- return 0
-
- # Some group maintained packages (e.g. Debian QA) are never NMU's
- if self.group_maint.has_key(changes["maintaineremail"].lower()):
- return 0
-
- return 1
-
-###############################################################################
-
class Upload:
def __init__(self, Cnf):
self.Cnf = Cnf
- # Read in the group-maint override file
- self.nmu = nmu_p(Cnf)
self.accept_count = 0
self.accept_bytes = 0L
self.pkg = Pkg(changes = {}, dsc = {}, dsc_files = {}, files = {},
d_changes[i] = changes[i]
## dsc
for i in [ "source", "version", "maintainer", "fingerprint",
- "uploaders", "bts changelog" ]:
+ "uploaders", "bts changelog", "dm-upload-allowed" ]:
if dsc.has_key(i):
d_dsc[i] = dsc[i]
## dsc_files
if not changes.has_key("distribution") or not isinstance(changes["distribution"], DictType):
changes["distribution"] = {}
+ override_summary ="";
file_keys = files.keys()
file_keys.sort()
for file in file_keys:
files[file]["pool name"] = utils.poolify (changes.get("source",""), files[file]["component"])
destination = self.Cnf["Dir::PoolRoot"] + files[file]["pool name"] + file
summary += file + "\n to " + destination + "\n"
+ if not files[file].has_key("type"):
+ files[file]["type"] = "unknown"
+ if files[file]["type"] in ["deb", "udeb", "dsc"]:
+ # (queue/unchecked): override entries already exist there, so use them.
+ # (process-new): no override entries exist yet, so use the newly generated ones.
+ override_prio = files[file].get("override priority", files[file]["priority"])
+ override_sect = files[file].get("override section", files[file]["section"])
+ override_summary += "%s - %s %s\n" % (file, override_prio, override_sect)
short_summary = summary
if byhand or new:
summary += "Changes: " + f
+ summary += "\n\nOverride entries for your package:\n" + override_summary + "\n"
+
summary += self.announce(short_summary, 0)
return (summary, short_summary)
return summary
bugs.sort()
- if not self.nmu.is_an_nmu(self.pkg):
- if changes["distribution"].has_key("experimental"):
- # tag bugs as fixed-in-experimental for uploads to experimental
- summary += "Setting bugs to severity fixed: "
- control_message = ""
- for bug in bugs:
- summary += "%s " % (bug)
- control_message += "tag %s + fixed-in-experimental\n" % (bug)
- if action and control_message != "":
- Subst["__CONTROL_MESSAGE__"] = control_message
- mail_message = utils.TemplateSubst(Subst,Cnf["Dir::Templates"]+"/process-unchecked.bug-experimental-fixed")
- utils.send_mail (mail_message)
- if action:
- self.Logger.log(["setting bugs to fixed"]+bugs)
-
-
- else:
- summary += "Closing bugs: "
- for bug in bugs:
- summary += "%s " % (bug)
- if action:
- Subst["__BUG_NUMBER__"] = bug
- if changes["distribution"].has_key("stable"):
- Subst["__STABLE_WARNING__"] = """
+ summary += "Closing bugs: "
+ for bug in bugs:
+ summary += "%s " % (bug)
+ if action:
+ Subst["__BUG_NUMBER__"] = bug
+ if changes["distribution"].has_key("stable"):
+ Subst["__STABLE_WARNING__"] = """
Note that this package is not part of the released stable Debian
distribution. It may have dependencies on other unreleased software,
or other instabilities. Please take care if you wish to install it.
The update will eventually make its way into the next released Debian
distribution."""
- else:
- Subst["__STABLE_WARNING__"] = ""
- mail_message = utils.TemplateSubst(Subst,Cnf["Dir::Templates"]+"/process-unchecked.bug-close")
- utils.send_mail (mail_message)
- if action:
- self.Logger.log(["closing bugs"]+bugs)
-
- else: # NMU
- summary += "Setting bugs to severity fixed: "
- control_message = ""
- for bug in bugs:
- summary += "%s " % (bug)
- control_message += "tag %s + fixed\n" % (bug)
- if action and control_message != "":
- Subst["__CONTROL_MESSAGE__"] = control_message
- mail_message = utils.TemplateSubst(Subst,Cnf["Dir::Templates"]+"/process-unchecked.bug-nmu-fixed")
- utils.send_mail (mail_message)
- if action:
- self.Logger.log(["setting bugs to fixed"]+bugs)
+ else:
+ Subst["__STABLE_WARNING__"] = ""
+ mail_message = utils.TemplateSubst(Subst,Cnf["Dir::Templates"]+"/process-unchecked.bug-close")
+ utils.send_mail (mail_message)
+ if action:
+ self.Logger.log(["closing bugs"]+bugs)
summary += "\n"
+
return summary
###########################################################################
section = files[file]["section"]
override_section = files[file]["override section"]
if section.lower() != override_section.lower() and section != "-":
- # Ignore this; it's a common mistake and not worth whining about
- if section.lower() == "non-us/main" and override_section.lower() == "non-us":
- continue
summary += "%s: package says section is %s, override says %s.\n" % (file, section, override_section)
priority = files[file]["priority"]
override_priority = files[file]["override priority"]
component_id = database.get_component_id(component)
type_id = database.get_override_type_id(type)
- # FIXME: nasty non-US speficic hack
- if component.lower().startswith("non-us/"):
- component = component[7:]
-
q = self.projectB.query("SELECT s.section, p.priority FROM override o, section s, priority p WHERE package = '%s' AND suite = %s AND component = %s AND type = %s AND o.section = s.id AND o.priority = p.id"
% (package, suite_id, component_id, type_id))
result = q.getresult()
# the .orig.tar.gz is a duplicate of the one in the archive]; if
# you're iterating over 'files' and call this function as part of
# the loop, be sure to add a check to the top of the loop to
- # ensure you haven't just tried to derefernece the deleted entry.
+ # ensure you haven't just tried to dereference the deleted entry.
# **WARNING**
def check_dsc_against_db(self, file):
# Try and find all files mentioned in the .dsc. This has
# to work harder to cope with the multiple possible
# locations of an .orig.tar.gz.
+ # The ordering on the select is needed to pick the newest orig
+ # when it exists in multiple places.
for dsc_file in dsc_files.keys():
found = None
if files.has_key(dsc_file):
actual_size = int(files[dsc_file]["size"])
found = "%s in incoming" % (dsc_file)
# Check the file does not already exist in the archive
- q = self.projectB.query("SELECT f.size, f.md5sum, l.path, f.filename FROM files f, location l WHERE f.filename LIKE '%%%s%%' AND l.id = f.location" % (dsc_file))
+ q = self.projectB.query("SELECT f.size, f.md5sum, l.path, f.filename FROM files f, location l WHERE f.filename LIKE '%%%s%%' AND l.id = f.location ORDER BY f.id DESC" % (dsc_file))
ql = q.getresult()
# Strip out anything that isn't '%s' or '/%s$'
for i in ql:
in_unchecked = os.path.join(self.Cnf["Dir::Queue::Unchecked"],dsc_file)
# See process_it() in 'dak process-unchecked' for explanation of this
- if os.path.exists(in_unchecked):
+ # in_unchecked check dropped by ajt 2007-08-28, how did that
+ # ever make sense?
+ if os.path.exists(in_unchecked) and False:
return (self.reject_message, in_unchecked)
else:
- for dir in [ "Accepted", "New", "Byhand" ]:
+ for dir in [ "Accepted", "New", "Byhand", "ProposedUpdates", "OldProposedUpdates" ]:
in_otherdir = os.path.join(self.Cnf["Dir::Queue::%s" % (dir)],dsc_file)
if os.path.exists(in_otherdir):
in_otherdir_fh = utils.open_file(in_otherdir)
re_parse_maintainer = re.compile(r"^\s*(\S.*\S)\s*\<([^\>]+)\>")
+re_srchasver = re.compile(r"^(\S+)\s+\((\S+)\)$")
+
changes_parse_error_exc = "Can't parse line in .changes file"
invalid_dsc_format_exc = "Invalid .dsc file"
nk_format_exc = "Unknown Format: in .changes file"
if section.find('/') != -1:
component = section.split('/')[0]
- if component.lower() == "non-us" and section.find('/') != -1:
- s = component + '/' + section.split('/')[1]
- if Cnf.has_key("Component::%s" % s): # Avoid e.g. non-US/libs
- component = s
-
- if section.lower() == "non-us":
- component = "non-US/main"
-
- # non-US prefix is case insensitive
- if component.lower()[:6] == "non-us":
- component = "non-US"+component[6:]
# Expand default component
if component == "":
component = section
else:
component = "main"
- elif component == "non-US":
- component = "non-US/main"
return (section, component)
changes_in.close()
changes["filecontents"] = "".join(lines)
+ if changes.has_key("source"):
+ # Strip the source version in brackets from the source field,
+ # put it in the "source-version" field instead.
+ srcver = re_srchasver.search(changes["source"])
+ if srcver:
+ changes["source"] = srcver.group(1)
+ changes["source-version"] = srcver.group(2)
+
if error:
raise changes_parse_error_exc, error
def poolify (source, component):
if component:
component += '/'
- # FIXME: this is nasty
- component = component.lower().replace("non-us/", "non-US/")
if source[:3] == "lib":
return component + source[:4] + '/' + source + '/'
else:
internal_error += "gpgv status line is malformed (incorrect prefix '%s').\n" % (gnupg)
continue
args = split[2:]
- if keywords.has_key(keyword) and (keyword != "NODATA" and keyword != "SIGEXPIRED"):
+ if keywords.has_key(keyword) and keyword not in [ "NODATA", "SIGEXPIRED", "KEYEXPIRED" ]:
internal_error += "found duplicate status token ('%s').\n" % (keyword)
continue
else:
if not keyserver:
keyserver = Cnf["Dinstall::KeyServer"]
if not keyring:
- keyring = Cnf["Dinstall::GPGKeyring"]
+ keyring = Cnf.ValueList("Dinstall::GPGKeyring")[0]
# Ensure the filename contains no shell meta-characters or other badness
if not re_taint_free.match(filename):
################################################################################
+def gpg_keyring_args(keyrings=None):
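+ # Build the "--keyring <file>" arguments for gpgv, defaulting to the
+ # keyrings configured in Dinstall::GPGKeyring.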
+ if not keyrings:
+ keyrings = Cnf.ValueList("Dinstall::GPGKeyring")
+
+ return " ".join(["--keyring %s" % x for x in keyrings])
+
+################################################################################
+
def check_signature (sig_filename, reject, data_filename="", keyrings=None, autofetch=None):
"""Check the signature of a file and return the fingerprint if the
signature is valid or 'None' if it's not. The first argument is the
return None
if not keyrings:
- keyrings = (Cnf["Dinstall::PGPKeyring"], Cnf["Dinstall::GPGKeyring"])
+ keyrings = Cnf.ValueList("Dinstall::GPGKeyring")
# Autofetch the signing key if that's enabled
if autofetch == None:
# Build the command line
status_read, status_write = os.pipe();
- cmd = "gpgv --status-fd %s" % (status_write)
- for keyring in keyrings:
- cmd += " --keyring %s" % (keyring)
- cmd += " %s %s" % (sig_filename, data_filename)
+ cmd = "gpgv --status-fd %s %s %s %s" % (
+ status_write, gpg_keyring_args(keyrings), sig_filename, data_filename)
+
# Invoke gpgv on the file
(output, status, exit_status) = gpgv_get_status_output(cmd, status_read, status_write)
bad = ""
# Now check for obviously bad things in the processed output
- if keywords.has_key("SIGEXPIRED"):
- reject("The key used to sign %s has expired." % (sig_filename))
- bad = 1
if keywords.has_key("KEYREVOKED"):
reject("The key used to sign %s has been revoked." % (sig_filename))
bad = 1
if keywords.has_key("NODATA"):
reject("no signature found in %s." % (sig_filename))
bad = 1
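+ # An expired key is only fatal if gpgv did not also report a good signature.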
+ if keywords.has_key("KEYEXPIRED") and not keywords.has_key("GOODSIG"):
+ reject("The key (0x%s) used to sign %s has expired." % (key, sig_filename))
+ bad = 1
if bad:
return None
# Finally ensure there's not something we don't recognise
known_keywords = Dict(VALIDSIG="",SIG_ID="",GOODSIG="",BADSIG="",ERRSIG="",
SIGEXPIRED="",KEYREVOKED="",NO_PUBKEY="",BADARMOR="",
- NODATA="")
+ NODATA="",NOTATION_DATA="",NOTATION_NAME="",KEYEXPIRED="")
for keyword in keywords.keys():
if not known_keywords.has_key(keyword):
-----------
o Usernames do not contain ",". [dak import-users-from-passwd]
-o Package names do not contain "~" [dak cruft-report]
+o Package names and versions do not contain "_" [dak cruft-report]
o Suites are case-independent in conf files, but forced lower case in use. [dak make-suite-file-list]
o Components are case-sensitive. [dak make-suite-file-list]
o There's always source of some sort
| Dinstall
| {
-| PGPKeyring "/org/keyring.debian.org/keyrings/debian-keyring.pgp";
-| GPGKeyring "/org/keyring.debian.org/keyrings/debian-keyring.gpg";
+| GPGKeyring {
+| "/org/keyring.debian.org/keyrings/debian-keyring.gpg";
+| "/org/keyring.debian.org/keyrings/debian-keyring.pgp";
+| };
| SigningKeyring "/org/ftp.debian.org/s3kr1t/dot-gnupg/secring.gpg";
| SendmailCommand "/usr/sbin/sendmail -odq -oi -t";
| MyEmailAddress "Debian Installer <installer@ftp-master.debian.org>";
| TrackingServer "packages.qa.debian.org";
| LockFile "/org/ftp.debian.org/dak/lock";
| Bcc "archive@ftp-master.debian.org";
-| GroupOverrideFilename "override.group-maint";
| FutureTimeTravelGrace 28800; // 8 hours
| PastCutoffYear "1984";
| BXANotify "false";
| };
| };
-PGPKeyring and GPGKeyring (required): filenames of the PGP and GnuPG
-keyrings to be used by dak respectively.
+GPGKeyring (required): list of the GnuPG/PGP keyring files to be
+used by dak.
SigningKeyring (optional): this is the private keyring used by 'dak
generate-releases'.
All sent mail is blind carbon copied to the email address in Bcc if it's
not blank.
-GroupOverrideFilename (optional): this is the override file which contains
-the list of email addresses which, if part of the Maintainer field, cause
-uploads to always be treated as maintainer uploads.
-
FutureTimeTravelGrace (required): specifies how many seconds into the
future timestamps are allowed to be inside a deb before being rejected.
o Do anything in proposed-updates/TODO
o Close any applicable stable bugs
- (hint: http://bugs.debian.org/cgi-bin/pkgreport.cgi?pkg=ftp.debian.org&include=woody)
+ (hint: http://bugs.debian.org/cgi-bin/pkgreport.cgi?pkg=ftp.debian.org&include=etch)
o Update version number in README, README.html and dists/README (ftp-master only)
o Update the 'Debian<n>.<n>r<n>' symlink in dists/
-o Clean up dists/ChangeLog (add header, basically)
-o Update version fields in dak.conf[-non-US]
+o Clean up dists/stable/ChangeLog (add header, basically)
+o Update version fields in dak.conf
o Update fields in suite table in postgresql (see below)
-o Comment out "Untouchable" in dak.conf[-non-US]
+o Comment out "Untouchable" in dak.conf
o Run 'dak make-suite-file-list -s stable'
-o Run apt-ftparchive generate apt.conf.stable[-non-US]
+o Run apt-ftparchive generate apt.conf.stable
o Run 'dak generate-releases stable' ** FIXME: requires apt.conf.stable stanza for stable in apt.conf
** FIXME: must be run as dak
-o Uncomment "Untouchable" in dak.conf[-non-US]
+o Uncomment "Untouchable" in dak.conf
Yes, this sucks and more of it should be automated.
#######################################################
-update suite set version = '3.0r4' where suite_name = 'stable';
-update suite set description = 'Debian 3.0r4 Released 31st December 2004' where suite_name = 'stable';
+update suite set version = '4.0r1' where suite_name = 'stable';
+update suite set description = 'Debian 4.0r1 Released 15th August 2007' where suite_name = 'stable';
#!/bin/sh -e
+export SCRIPTVARS=/srv/ftp.debian.org/dak/config/debian/vars
+. $SCRIPTVARS
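+# the sourced vars file defines $base, used for the paths below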
umask 002
-cd /org/ftp.debian.org/ftp/indices/files/components
+cd $base/ftp/indices/files/components
ARCHLIST=$(tempfile)
echo "Querying projectb..."
-echo 'SELECT l.path, f.filename, a.arch_string FROM location l JOIN files f ON (f.location = l.id) LEFT OUTER JOIN (binaries b JOIN architecture a ON (b.architecture = a.id)) ON (f.id = b.file)' | psql projectb -At | sed 's/|//;s/|all$/|/;s,^/org/ftp.debian.org/ftp,.,' | sort >$ARCHLIST
+echo 'SELECT l.path, f.filename, a.arch_string FROM location l JOIN files f ON (f.location = l.id) LEFT OUTER JOIN (binaries b JOIN architecture a ON (b.architecture = a.id)) ON (f.id = b.file)' | psql projectb -At | sed 's/|//;s/|all$/|/;s,^/srv/ftp.debian.org/ftp,.,' | sort >$ARCHLIST
includedirs () {
perl -ne 'print; while (m,/[^/]+$,) { $_=$`; print $_ . "\n" unless $d{$_}++; }'
(
sed -n 's/|$//p' $ARCHLIST
- cd /org/ftp.debian.org/ftp
+ cd $base/ftp
find ./dists -maxdepth 1 \! -type d
find ./dists \! -type d | grep "/source/"
) | sort -u | gzip -9 > source.list.gz
for a in $ARCHES; do
(sed -n "s/|$a$//p" $ARCHLIST
- cd /org/ftp.debian.org/ftp;
+ cd $base/ftp
find ./dists -maxdepth 1 \! -type d
find ./dists \! -type d | grep -E "(proposed-updates.*_$a.changes$|/main/disks-$a/|/main/installer-$a/|/Contents-$a|/binary-$a/)"
if echo X sparc mips mipsel hppa X | grep -q " $a "; then
printf 'SELECT id, suite_name FROM suite\n' | psql -F' ' -At projectb |
while read id suite; do
- [ -e /org/ftp.debian.org/ftp/dists/$suite ] || continue
+ [ -e $base/ftp/dists/$suite ] || continue
(
- (cd /org/ftp.debian.org/ftp;
+ (cd $base/ftp
distname=$(cd dists; readlink $suite || echo $suite)
find ./dists/$distname \! -type d
for distdir in ./dists/*; do
[ "$(readlink $distdir)" != "$distname" ] || echo $distdir
done
)
- suite_list $id | tr -d ' ' | sed 's,^/org/ftp.debian.org/ftp,.,'
+ suite_list $id | tr -d ' ' | sed 's,^/srv/ftp.debian.org/ftp,.,'
) | sort -u | gzip -9 > suite-${suite}.list.gz
done
echo "Finding everything on the ftp site to generate sundries $(date +"%X")..."
-(cd /org/ftp.debian.org/ftp; find . \! -type d \! -name 'Archive_Maintenance_In_Progress' | sort) >$ARCHLIST
+(cd $base/ftp; find . \! -type d \! -name 'Archive_Maintenance_In_Progress' | sort) >$ARCHLIST
rm -f sundries.list
zcat *.list.gz | cat - *.list | sort -u |
sort -u | poolfirst > ../arch-$a.files
done
-(cat ../arch-i386.files ../arch-amd64.files; zcat suite-stable.list.gz) |
+(cat ../arch-i386.files ../arch-amd64.files; zcat suite-oldstable.list.gz suite-proposed-updates.list.gz) |
sort -u | poolfirst > ../typical.files
rm -f $ARCHLIST
. $SCRIPTVARS
cd $base/misc/
-nonusmaint="$base/misc/Maintainers_Versions-non-US"
-
-
-if wget -T15 -q -O Maintainers_Versions-non-US.gz http://non-us.debian.org/indices-non-US/Maintainers_Versions.gz; then
- rm -f $nonusmaint
- gunzip -c ${nonusmaint}.gz > $nonusmaint
- rm -f ${nonusmaint}.gz
-fi
-
cd $indices
-dak make-maintainers $nonusmaint $configdir/pseudo-packages.maintainers | sed -e "s/~[^ ]*\([ ]\)/\1/" | awk '{printf "%-20s ", $1; for (i=2; i<=NF; i++) printf "%s ", $i; printf "\n";}' > .new-maintainers
+dak make-maintainers $configdir/pseudo-packages.maintainers | sed -e "s/~[^ ]*\([ ]\)/\1/" | awk '{printf "%-20s ", $1; for (i=2; i<=NF; i++) printf "%s ", $i; printf "\n";}' > .new-maintainers
set +e
cmp .new-maintainers Maintainers >/dev/null
"sparc");
while (<>) {
- if (/^(\d{8})\d{6}\|k(?:atie|elly)\|installed\|[^|]+\|[^|]+\|(\d+)\|([-\w]+)$/) {
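+ # accept "installed" log lines written by katie/kelly as well as by dak process-accepted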
+ if (/^(\d{8})\d{6}\|(?:k(?:atie|elly)|process-accepted)\|installed\|[^|]+\|[^|]+\|(\d+)\|([-\w]+)$/) {
if (not defined $data{$1}) {
foreach $a (@archs) {
$data{$1}{$a} = 0;
rm -f $ftpdir/README.mirrors.html $ftpdir/README.mirrors.txt
$prog -m $masterlist -t html > $ftpdir/README.mirrors.html
$prog -m $masterlist -t text > $ftpdir/README.mirrors.txt
- if [ ! -f $ftpdir/README.non-US -o $masterlist -nt $ftpdir/README.non-US ] ; then
- rm -f $ftpdir/README.non-US
- $prog -m $masterlist -t nonus > $ftpdir/README.non-US
- install -m 664 $ftpdir/README.non-US $webdir
- fi
echo Updated archive version of mirrors file
fi
archive_id_seq, bin_associations, bin_associations_id_seq, binaries,
binaries_id_seq, component, component_id_seq, dsc_files,
dsc_files_id_seq, files, files_id_seq, fingerprint,
- fingerprint_id_seq, location, location_id_seq, maintainer,
+ fingerprint_id_seq, keyrings, keyrings_id_seq,
+ location, location_id_seq, maintainer,
maintainer_id_seq, override, override_type, override_type_id_seq,
priority, priority_id_seq, section, section_id_seq, source,
- source_id_seq, src_associations, src_associations_id_seq, suite,
+ source_id_seq, src_uploaders, src_uploaders_id_seq,
+ src_associations, src_associations_id_seq, suite,
suite_architectures, suite_id_seq, queue_build, uid,
uid_id_seq TO GROUP ftpmaster;
archive_id_seq, bin_associations, bin_associations_id_seq, binaries,
binaries_id_seq, component, component_id_seq, dsc_files,
dsc_files_id_seq, files, files_id_seq, fingerprint,
- fingerprint_id_seq, location, location_id_seq, maintainer,
+ fingerprint_id_seq, keyrings, keyrings_id_seq,
+ location, location_id_seq, maintainer,
maintainer_id_seq, override, override_type, override_type_id_seq,
priority, priority_id_seq, section, section_id_seq, source,
- source_id_seq, src_associations, src_associations_id_seq, suite,
+ source_id_seq, src_uploaders, src_uploaders_id_seq,
+ src_associations, src_associations_id_seq, suite,
suite_architectures, suite_id_seq, queue_build, uid,
uid_id_seq TO PUBLIC;
name TEXT UNIQUE NOT NULL
);
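+-- maps a source package to the maintainer entries listed as its uploaders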
+CREATE TABLE src_uploaders (
+ id SERIAL PRIMARY KEY,
+ source INT4 NOT NULL REFERENCES source,
+ maintainer INT4 NOT NULL REFERENCES maintainer
+);
+
CREATE TABLE uid (
id SERIAL PRIMARY KEY,
- uid TEXT UNIQUE NOT NULL
+ uid TEXT UNIQUE NOT NULL,
+ name TEXT
);
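+-- keyrings known to dak; fingerprint.keyring records which keyring a fingerprint was found in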
+CREATE TABLE keyrings (
+ id SERIAL PRIMARY KEY,
+ name TEXT
+);
+
+
CREATE TABLE fingerprint (
id SERIAL PRIMARY KEY,
fingerprint TEXT UNIQUE NOT NULL,
- uid INT4 REFERENCES uid
+ uid INT4 REFERENCES uid,
+ keyring INT4 REFERENCES keyrings
);
CREATE TABLE location (
sql-aptvc.o: sql-aptvc.cpp
sql-aptvc.so: sql-aptvc.o
- $(C++) $(LDFLAGS) $(LIBS) -shared -o $@ $<
+ $(CC) $(LDFLAGS) $(LIBS) -shared -o $@ $<
clean:
rm -f sql-aptvc.so sql-aptvc.o
To: __BUG_NUMBER__-close@__BUG_SERVER__
__CC__
__BCC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: Bug#__BUG_NUMBER__: fixed
We believe that the bug you reported is now fixed; the following
To: __MAINTAINER_TO__
__BCC__
Precedence: bulk
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: __CHANGES_FILENAME__ INSTALLED__SUITE__
__REJECT_MESSAGE__
To: __MAINTAINER_TO__
__CC__
__BCC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: __CHANGES_FILENAME__ UNACCEPT
__REJECT_MESSAGE__
From: Ben Collins <bxa@ftp-master.debian.org>
-To: crypt@bxa.doc.gov
-Cc: bxa@ftp-master.debian.org
+X-Not-Really-To: crypt@bis.doc.gov, enc@nsa.gov, web_site@bis.doc.gov
+To: bxa@ftp-master.debian.org
__BCC__
Precedence: junk
-Subject: Addition to __DISTRO__ Source Code
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
+Subject: TSU Notification - Addition to __DISTRO__ Source Code
- Department of Commerce
- Bureau of Export Administration
- Office of Strategic Trade and Foreign Policy Controls
- 14th Street and Pennsylvania Ave., N.W.
- Room 2705
- Washington, DC 20230
+SUBMISSION TYPE: TSU
+SUBMITTED FOR: Software in the Public Interest (Debian)
+POINT OF CONTACT: Ben Collins
+PHONE and/or FAX: (804) 695-9730
+PRODUCT NAME/MODEL #: Debian Source Code
+ECCN: 5D002
-Re: Unrestricted Encryption Source Code Notification
+NOTIFICATION: http://ftp.debian.org/debian/
+
+Re: Unrestricted Encryption Source Code Notification
Commodity: Addition to Debian Source Code
+Attn: "TSU Notification"
+U.S. Department of Commerce
+Bureau of Industry and Security
+Office of National Security and Technology Transfer Controls (NSTTC)
+14th Street and Pennsylvania Avenue, NW
+Room 2705
+Washington, D.C. 20230
+Fax: (202) 219-9179
+
+Attn: ENC Encryption Request Coordinator
+9800 Savage Road, Suite 6940
+Ft. Meade, MD 20755-6000
+
Dear Sir/Madam,
Pursuant to paragraph (e)(1) of Part 740.13 of the U.S. Export
From: __FROM_ADDRESS__
To: __MAINTAINER_TO__
__CC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: Comments regarding __CHANGES_FILENAME__
__PROD_MESSAGE__
To: __MAINTAINER_TO__
__BCC__
Precedence: bulk
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: __CHANGES_FILENAME__ ACCEPTED__SUITE__
__REJECT_MESSAGE__
From: __MAINTAINER_FROM__
To: __ANNOUNCE_LIST_ADDRESS__
__BCC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: Accepted __SOURCE__ __VERSION__ (__ARCHITECTURE__)
__FILE_CONTENTS__
From: __MAINTAINER_FROM__
To: __BUG_NUMBER__-close@__BUG_SERVER__
__BCC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: Bug#__BUG_NUMBER__: fixed in __SOURCE__ __VERSION__
Source: __SOURCE__
To: control@__BUG_SERVER__
Cc: __MAINTAINER_TO__
__BCC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: Fixed in upload of __SOURCE__ __VERSION__ to experimental
__CONTROL_MESSAGE__
To: control@__BUG_SERVER__
Cc: __MAINTAINER_TO__
__BCC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: Fixed in NMU of __SOURCE__ __VERSION__
__CONTROL_MESSAGE__
To: __MAINTAINER_TO__
__BCC__
Precedence: bulk
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: __CHANGES_FILENAME__ is NEW
__SUMMARY__
To: __MAINTAINER_TO__
__BCC__
Precedence: junk
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: __SOURCE__ override disparity
There are disparities between your recently accepted upload and the
__BCC__
__CC__
Precedence: bulk
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: __CHANGES_FILENAME__ REJECTED
__MANUAL_REJECT_MESSAGE__
__CC__
__BCC__
Precedence: bulk
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: __CHANGES_FILENAME__ REJECTED from proposed-updates
Your package was rejected by an ftp master on behalf of
To: __BUG_NUMBER__-close@__BUG_SERVER__
__CC__
__BCC__
+MIME-Version: 1.0
+Content-Type: text/plain; charset="utf-8"
+Content-Transfer-Encoding: 8bit
Subject: Bug#__BUG_NUMBER__: fixed
We believe that the bug you reported is now fixed; the following
Subject: Template Advisory __ADVISORY__
------------------------------------------------------------------------
-Debian Security Advisory __ADVISORY__ security@debian.org
+Debian Security Advisory __ADVISORY__ security@debian.org
http://www.debian.org/security/ __WHOAMI__
-__DATE__
+__DATE__ http://www.debian.org/security/faq
------------------------------------------------------------------------
Package : __PACKAGE__
BugTraq ID : XXX
Debian Bug : XXX
-...
+Several local/remote vulnerabilities have been discovered in...
+The Common
+Vulnerabilities and Exposures project identifies the following problems:
[single issue]
-For the stable distribution (woody), this problem has been fixed in version XXX
+Foo discovered that
-For the old stable distribution (potato), this problem has been fixed in
-version XXX
+
+[single issue]
+For the stable distribution (etch), this problem has been fixed in version XXX
+__PACKAGE__
+
+For the old stable distribution (sarge), this problem has been fixed in
+version __PACKAGE__
[multiple issues]
-For the stable distribution (woody), these problems have been fixed in version
-XXX
+For the stable distribution (etch), these problems have been fixed in version
+__PACKAGE__
-For the old stable distribution (potato), these problems have been fixed in
-version XXX
+For the old stable distribution (sarge), these problems have been fixed in
+version __PACKAGE__
We recommend that you upgrade your __PACKAGE__ package.
You may use an automated update by adding the resources from the
footer to the proper configuration.
+
+Debian GNU/Linux 3.1 alias sarge
+--------------------------------
+
+Debian GNU/Linux 4.0 alias etch
+-------------------------------
+
+
__ADVISORY_TEXT__