Coverage for britney.py: 84%
749 statements
coverage.py v6.5.0, created at 2024-04-18 20:48 +0000

#!/usr/bin/python3 -u
# -*- coding: utf-8 -*-

# Copyright (C) 2001-2008 Anthony Towns <ajt@debian.org>
#                         Andreas Barth <aba@debian.org>
#                         Fabio Tranchitella <kobold@debian.org>
# Copyright (C) 2010-2013 Adam D. Barratt <adsb@debian.org>

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

19""" 

20= Introduction = 

21 

22This is the Debian testing updater script, also known as "Britney". 

23 

24Packages are usually installed into the `testing' distribution after 

25they have undergone some degree of testing in unstable. The goal of 

26this software is to do this task in a smart way, allowing testing 

27to always be fully installable and close to being a release candidate. 

28 

29Britney's source code is split between two different but related tasks: 

30the first one is the generation of the update excuses, while the 

31second tries to update testing with the valid candidates; first 

32each package alone, then larger and even larger sets of packages 

33together. Each try is accepted if testing is not more uninstallable 

34after the update than before. 
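The acceptance rule above can be sketched as follows. This is an illustrative
stand-in with invented names and data shapes, not Britney's actual
implementation (the real check is britney2.utils.is_nuninst_asgood_generous,
which also has to honour break architectures):

```python
# Sketch only: a migration attempt is kept when no architecture ends up
# with more uninstallable packages than before the attempt.
def attempt_is_acceptable(nuninst_before, nuninst_after):
    # Both arguments map an architecture name to the set of package
    # names that are uninstallable in testing on that architecture.
    return all(len(nuninst_after[arch]) <= len(nuninst_before[arch])
               for arch in nuninst_before)

before = {'amd64': {'pkg-a'}, 'i386': set()}
print(attempt_is_acceptable(before, {'amd64': set(), 'i386': set()}))                 # True
print(attempt_is_acceptable(before, {'amd64': {'pkg-a', 'pkg-b'}, 'i386': set()}))    # False
```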

= Data Loading =

In order to analyze the entire Debian distribution, Britney needs to
load the whole archive into memory: more than 10,000 packages
for twelve architectures, as well as the dependency interconnections
between them. For this reason, the memory requirements for running this
software are quite high and at least 1 gigabyte of RAM should be available.

Britney loads the source packages from the `Sources' file and the binary
packages from the `Packages_${arch}' files, where ${arch} is substituted
with the supported architectures. While loading the data, the software
analyzes the dependencies and builds a directed weighted graph in memory
with all the interconnections between the packages (see Britney.read_sources
and Britney.read_binaries).

Other than source and binary packages, Britney loads the following data:

  * BugsV, which contains the list of release-critical bugs for a given
    version of a source or binary package (see RCBugPolicy.read_bugs).

  * Dates, which contains the date of the upload of a given version
    of a source package (see Britney.read_dates).

  * Urgencies, which contains the urgency of the upload of a given
    version of a source package (see AgePolicy._read_urgencies).

  * Hints, which contains lists of commands which modify the standard
    behaviour of Britney (see Britney.read_hints).

  * Other policies typically require their own data.

For a more detailed explanation of the format of these files, please read
the documentation of the related methods. Their exact meaning is
explained in the chapter "Excuses Generation".

= Excuses =

An excuse is a detailed explanation of why a package can or cannot
be updated in the testing distribution from a newer package in
another distribution (for example unstable). The main purpose
of the excuses is to be written to an HTML file which will be
published over HTTP, as well as to a YAML file. The maintainers will be
able to parse these manually or automatically to find the explanation of
why their packages have or have not been updated.

== Excuses generation ==

These are the steps (with references to method names) that Britney
follows to generate the update excuses.

  * If a source package is available in testing but it is not
    present in unstable and no binary packages in unstable are
    built from it, then it is marked for removal.

  * Every source package in unstable and testing-proposed-updates,
    if already present in testing, is checked for binary-NMUs, new
    or dropped binary packages in all the supported architectures
    (see Britney.should_upgrade_srcarch). The steps to detect if an
    upgrade is needed are:

    1. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    2. For every binary package built from the new source, it checks
       for unsatisfied dependencies, new binary packages and updated
       binary packages (binNMUs), excluding the architecture-independent
       ones and packages not built from the same source.

    3. For every binary package built from the old source, it checks
       if it is still built from the new source; if this is not true
       and the package is not architecture-independent, the script
       removes it from testing.

    4. Finally, if there is something worth doing (e.g. a new or updated
       binary package) and nothing wrong, it marks the source package
       as "Valid candidate", or as "Not considered" if there is something
       wrong which prevented the update.

  * Every source package in unstable and testing-proposed-updates is
    checked for upgrade (see Britney.should_upgrade_src). The steps
    to detect if an upgrade is needed are:

    1. If the source package in testing is more recent, the new one
       is ignored.

    2. If the source package doesn't exist (is fake), which means that
       a binary package refers to it but it is not present in the
       `Sources' file, the new one is ignored.

    3. If the package doesn't exist in testing, the urgency of the
       upload is ignored and set to the default (currently `low').

    4. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    5. If there is a `block' hint for the source package without an
       `unblock' hint or a `block-all source' hint, the package is
       ignored.

    6. If there is a `block-udeb' hint for the source package, it will
       have the same effect as `block', but may only be cancelled by
       a subsequent `unblock-udeb' hint.

    7. If the suite is unstable, the update can go ahead only if the
       upload happened more than the minimum number of days specified
       by the urgency of the upload; if this is not true, the package
       is ignored as `too-young'. Note that the urgency is sticky,
       meaning that the highest urgency uploaded since the previous
       testing transition is taken into account.

    8. If the suite is unstable, all the architecture-dependent binary
       packages and the architecture-independent ones for the `nobreakall'
       architectures have to be built from the source we are considering.
       If this is not true, then these are called `out-of-date'
       architectures and the package is ignored.

    9. The source package must have at least one binary package, otherwise
       it is ignored.

    10. If the suite is unstable, the new source package must have no
        release-critical bugs which do not also apply to the testing
        one. If this is not true, the package is ignored as `buggy'.

    11. If there is a `force' hint for the source package, then it is
        updated even if it is marked as ignored by the previous steps.

    12. If the suite is {testing-,}proposed-updates, the source package can
        be updated only if there is an explicit approval for it. Unless
        a `force' hint exists, the new package must also be available
        on all of the architectures for which it has binary packages in
        testing.

    13. If the package was not ignored by any of the previous steps, it
        is marked as "Valid candidate"; otherwise it is marked as
        "Not considered".

  * The list of `remove' hints is processed: if the requested source
    package is not already being updated or removed and the version
    actually in testing is the same as the one specified with the
    `remove' hint, it is marked for removal.

  * The excuses are sorted by the number of days from the last upload
    (days-old) and by name.

  * A list of unconsidered excuses (for which the package is not upgraded)
    is built. Using this list, all of the excuses depending on them are
    marked as invalid ("impossible dependencies").

  * The excuses are written to an HTML file.
"""

import contextlib
import logging
import optparse
import os
import sys
import time
from collections import defaultdict
from functools import reduce
from itertools import chain
from operator import attrgetter

import apt_pkg

from britney2 import SourcePackage, BinaryPackageId, BinaryPackage
from britney2.excusefinder import ExcuseFinder
from britney2.hints import HintParser
from britney2.inputs.suiteloader import DebMirrorLikeSuiteContentLoader, MissingRequiredConfigurationError
from britney2.installability.builder import build_installability_tester
from britney2.installability.solver import InstallabilitySolver
from britney2.migration import MigrationManager
from britney2.migrationitem import MigrationItemFactory
from britney2.policies.policy import (AgePolicy,
                                      RCBugPolicy,
                                      PiupartsPolicy,
                                      DependsPolicy,
                                      BuildDependsPolicy,
                                      PolicyEngine,
                                      BlockPolicy,
                                      BuiltUsingPolicy,
                                      BuiltOnBuilddPolicy,
                                      ImplicitDependencyPolicy,
                                      PolicyLoadRequest,
                                      ReproduciblePolicy,
                                      ReverseRemovalPolicy,
                                      )
from britney2.policies.autopkgtest import AutopkgtestPolicy
from britney2.utils import (log_and_format_old_libraries,
                            read_nuninst, write_nuninst, write_heidi,
                            format_and_log_uninst, newly_uninst,
                            write_excuses, write_heidi_delta,
                            old_libraries, is_nuninst_asgood_generous,
                            clone_nuninst, compile_nuninst, parse_provides,
                            parse_option,
                            MigrationConstraintException,
                            )

__author__ = 'Fabio Tranchitella and the Debian Release Team'
__version__ = '2.0'


MIGRATION_POLICIES = [
    PolicyLoadRequest.always_load(DependsPolicy),
    PolicyLoadRequest.conditionally_load(RCBugPolicy, 'rcbug_enable', True),
    PolicyLoadRequest.conditionally_load(PiupartsPolicy, 'piuparts_enable', True),
    PolicyLoadRequest.always_load(ImplicitDependencyPolicy),
    PolicyLoadRequest.conditionally_load(AutopkgtestPolicy, 'adt_enable', True),
    PolicyLoadRequest.conditionally_load(ReproduciblePolicy, 'repro_enable', False),
    PolicyLoadRequest.conditionally_load(AgePolicy, 'age_enable', True),
    PolicyLoadRequest.always_load(BuildDependsPolicy),
    PolicyLoadRequest.always_load(BlockPolicy),
    PolicyLoadRequest.conditionally_load(BuiltUsingPolicy, 'built_using_policy_enable', True),
    PolicyLoadRequest.conditionally_load(BuiltOnBuilddPolicy, 'check_buildd', False),
    PolicyLoadRequest.always_load(ReverseRemovalPolicy),
]


class Britney(object):
    """Britney, the Debian testing updater script

    This is the script that updates the testing distribution. It is executed
    each day after the installation of the updated packages. It generates the
    `Packages' files for the testing distribution, but it does so in an
    intelligent manner; it tries to avoid any inconsistency and to use only
    non-buggy packages.

    For more documentation on this script, please read the Developers Reference.
    """

    HINTS_HELPERS = ("easy", "hint", "remove", "block", "block-udeb", "unblock", "unblock-udeb", "approve",
                     "remark", "ignore-piuparts", "ignore-rc-bugs", "force-skiptest", "force-badtest")
    HINTS_STANDARD = ("urgent", "age-days") + HINTS_HELPERS
    # ALL = {"force", "force-hint", "block-all"} | HINTS_STANDARD | registered policy hints (not covered above)
    HINTS_ALL = ('ALL')

    def __init__(self):
        """Class constructor

        This method initializes and populates the data lists, which contain all
        the information needed by the other methods of the class.
        """

        # setup logging - provide the "short level name" (i.e. INFO -> I) that
        # we used to use prior to using the logging module.

        old_factory = logging.getLogRecordFactory()
        short_level_mapping = {
            'CRITICAL': 'F',
            'INFO': 'I',
            'WARNING': 'W',
            'ERROR': 'E',
            'DEBUG': 'N',
        }

        def record_factory(*args, **kwargs):  # pragma: no cover
            record = old_factory(*args, **kwargs)
            try:
                record.shortlevelname = short_level_mapping[record.levelname]
            except KeyError:
                record.shortlevelname = record.levelname
            return record

        logging.setLogRecordFactory(record_factory)
        logging.basicConfig(format='{shortlevelname}: [{asctime}] - {message}',
                            style='{',
                            datefmt="%Y-%m-%dT%H:%M:%S%z",
                            stream=sys.stdout,
                            )

        self.logger = logging.getLogger()

        # Logger for "upgrade_output"; the file handler will be attached later when
        # we are ready to open the file.
        self.output_logger = logging.getLogger('britney2.output.upgrade_output')
        self.output_logger.setLevel(logging.INFO)

        # initialize the apt_pkg back-end
        apt_pkg.init()

        # parse the command line arguments
        self._policy_engine = PolicyEngine()
        self.suite_info = None  # Initialized during __parse_arguments
        self.__parse_arguments()
        assert self.suite_info is not None  # for type checking

        self.all_selected = []
        self.excuses = {}
        self.upgrade_me = []

        if self.options.nuninst_cache:
            self.logger.info("Not building the list of non-installable packages, as requested")
            if self.options.print_uninst:
                nuninst = read_nuninst(self.options.noninst_status,
                                       self.options.architectures)
                print('* summary')
                print('\n'.join('%4d %s' % (len(nuninst[x]), x) for x in self.options.architectures))
                return

        try:
            constraints_file = os.path.join(self.options.static_input_dir, 'constraints')
            faux_packages = os.path.join(self.options.static_input_dir, 'faux-packages')
        except AttributeError:
            self.logger.info("The static_input_dir option is not set")
            constraints_file = None
            faux_packages = None
        if faux_packages is not None and os.path.exists(faux_packages):
            self.logger.info("Loading faux packages from %s", faux_packages)
            self._load_faux_packages(faux_packages)
        elif faux_packages is not None:
            self.logger.info("No Faux packages as %s does not exist", faux_packages)

        if constraints_file is not None and os.path.exists(constraints_file):
            self.logger.info("Loading constraints from %s", constraints_file)
            self.constraints = self._load_constraints(constraints_file)
        else:
            if constraints_file is not None:
                self.logger.info("No constraints as %s does not exist", constraints_file)
            self.constraints = {
                'keep-installable': [],
            }

        self.logger.info("Compiling Installability tester")
        self.pkg_universe, self._inst_tester = build_installability_tester(self.suite_info, self.options.architectures)
        target_suite = self.suite_info.target_suite
        target_suite.inst_tester = self._inst_tester

        self.allow_uninst = {}
        for arch in self.options.architectures:
            self.allow_uninst[arch] = set()
        self._migration_item_factory = MigrationItemFactory(self.suite_info)
        self._hint_parser = HintParser(self._migration_item_factory)
        self._migration_manager = MigrationManager(self.options, self.suite_info, self.all_binaries, self.pkg_universe,
                                                   self.constraints, self.allow_uninst, self._migration_item_factory,
                                                   self.hints)

        if not self.options.nuninst_cache:
            self.logger.info("Building the list of non-installable packages for the full archive")
            self._inst_tester.compute_installability()
            nuninst = compile_nuninst(target_suite,
                                      self.options.architectures,
                                      self.options.nobreakall_arches)
            self.nuninst_orig = nuninst
            for arch in self.options.architectures:
                self.logger.info("> Found %d non-installable packages", len(nuninst[arch]))
                if self.options.print_uninst:
                    self.nuninst_arch_report(nuninst, arch)

            if self.options.print_uninst:
                print('* summary')
                print('\n'.join(map(lambda x: '%4d %s' % (len(nuninst[x]), x), self.options.architectures)))
                return
            else:
                write_nuninst(self.options.noninst_status, nuninst)

            stats = self._inst_tester.compute_stats()
            self.logger.info("> Installability tester statistics (per architecture)")
            for arch in self.options.architectures:
                arch_stat = stats[arch]
                self.logger.info("> %s", arch)
                for stat in arch_stat.stat_summary():
                    self.logger.info("> - %s", stat)
        else:
            self.logger.info("Loading uninstallability counters from cache")
            self.nuninst_orig = read_nuninst(self.options.noninst_status,
                                             self.options.architectures)

        # nuninst_orig may get updated during the upgrade process
        self.nuninst_orig_save = clone_nuninst(self.nuninst_orig, architectures=self.options.architectures)

        self._policy_engine.register_policy_hints(self._hint_parser)

        try:
            self.read_hints(self.options.hintsdir)
        except AttributeError:
            self.read_hints(os.path.join(self.suite_info['unstable'].path, 'Hints'))

        self._policy_engine.initialise(self, self.hints)

    def __parse_arguments(self):
        """Parse the command line arguments

        This method parses and initializes the command line arguments.
        While doing so, it preprocesses some of the options to convert
        them into a form suitable for the other methods of the class.
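        For reference, the configuration file is a simple `KEY = value' file,
        as the parsing loop below shows. The keys in this excerpt are
        illustrative (the exact set depends on the deployment); HINTS_* entries
        grant hint permissions per user, where each token names either a
        HINTS_* class attribute (e.g. STANDARD) or an individual hint command:

        ```
        # /etc/britney.conf (illustrative excerpt)
        ARCHITECTURES   = amd64 arm64 i386
        HINTS_JANE      = STANDARD
        HINTS_AUTOPILOT = remove age-days
        ```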

417 """ 

418 # initialize the parser 

419 parser = optparse.OptionParser(version="%prog") 

420 parser.add_option("-v", "", action="count", dest="verbose", help="enable verbose output") 

421 parser.add_option("-c", "--config", action="store", dest="config", default="/etc/britney.conf", 

422 help="path for the configuration file") 

423 parser.add_option("", "--architectures", action="store", dest="architectures", default=None, 

424 help="override architectures from configuration file") 

425 parser.add_option("", "--actions", action="store", dest="actions", default=None, 

426 help="override the list of actions to be performed") 

427 parser.add_option("", "--hints", action="store", dest="hints", default=None, 

428 help="additional hints, separated by semicolons") 

429 parser.add_option("", "--hint-tester", action="store_true", dest="hint_tester", default=None, 

430 help="provide a command line interface to test hints") 

431 parser.add_option("", "--dry-run", action="store_true", dest="dry_run", default=False, 

432 help="disable all outputs to the testing directory") 

433 parser.add_option("", "--nuninst-cache", action="store_true", dest="nuninst_cache", default=False, 

434 help="do not build the non-installability status, use the cache from file") 

435 parser.add_option("", "--print-uninst", action="store_true", dest="print_uninst", default=False, 

436 help="just print a summary of uninstallable packages") 

437 parser.add_option("", "--compute-migrations", action="store_true", dest="compute_migrations", default=True, 

438 help="Compute which packages can migrate (the default)") 

439 parser.add_option("", "--no-compute-migrations", action="store_false", dest="compute_migrations", 

440 help="Do not compute which packages can migrate.") 

441 parser.add_option("", "--series", action="store", dest="series", default='', 

442 help="set distribution series name") 

443 parser.add_option("", "--distribution", action="store", dest="distribution", default="debian", 

444 help="set distribution name") 

445 (self.options, self.args) = parser.parse_args() 

446 

447 if self.options.verbose: 447 ↛ 453line 447 didn't jump to line 453, because the condition on line 447 was never false

448 if self.options.verbose > 1: 448 ↛ 449line 448 didn't jump to line 449, because the condition on line 448 was never true

449 self.logger.setLevel(logging.DEBUG) 

450 else: 

451 self.logger.setLevel(logging.INFO) 

452 else: 

453 self.logger.setLevel(logging.WARNING) 

454 # Historical way to get debug information (equivalent to -vv) 

455 try: # pragma: no cover 

456 if int(os.environ.get('BRITNEY_DEBUG', '0')): 

457 self.logger.setLevel(logging.DEBUG) 

458 except ValueError: # pragma: no cover 

459 pass 

460 

461 # integrity checks 

462 if self.options.nuninst_cache and self.options.print_uninst: # pragma: no cover 

463 self.logger.error("nuninst_cache and print_uninst are mutually exclusive!") 

464 sys.exit(1) 

465 

466 # if the configuration file exists, then read it and set the additional options 

467 if not os.path.isfile(self.options.config): # pragma: no cover 

468 self.logger.error("Unable to read the configuration file (%s), exiting!", self.options.config) 

469 sys.exit(1) 

470 

471 self.HINTS = {'command-line': self.HINTS_ALL} 

472 with open(self.options.config, encoding='utf-8') as config: 

473 for line in config: 

474 if '=' in line and not line.strip().startswith('#'): 

475 k, v = line.split('=', 1) 

476 k = k.strip() 

477 v = v.strip() 

478 if k.startswith("HINTS_"): 

479 self.HINTS[k.split("_")[1].lower()] = \ 

480 reduce(lambda x, y: x+y, [ 

481 hasattr(self, "HINTS_" + i) and 

482 getattr(self, "HINTS_" + i) or 

483 (i,) for i in v.split()]) 

484 elif not hasattr(self.options, k.lower()) or \ 

485 not getattr(self.options, k.lower()): 

486 setattr(self.options, k.lower(), v) 

487 

488 parse_option(self.options, 'archall_inconsistency_allowed', to_bool=True) 

489 

490 suite_loader = DebMirrorLikeSuiteContentLoader(self.options) 

491 

492 try: 

493 self.suite_info = suite_loader.load_suites() 

494 except MissingRequiredConfigurationError as e: # pragma: no cover 

495 self.logger.error("Could not load the suite content due to missing configuration: %s", str(e)) 

496 sys.exit(1) 

497 self.all_binaries = suite_loader.all_binaries() 

498 self.options.components = suite_loader.components 

499 self.options.architectures = suite_loader.architectures 

500 self.options.nobreakall_arches = suite_loader.nobreakall_arches 

501 self.options.outofsync_arches = suite_loader.outofsync_arches 

502 self.options.break_arches = suite_loader.break_arches 

503 self.options.new_arches = suite_loader.new_arches 

504 if self.options.series == '': 504 ↛ 507line 504 didn't jump to line 507, because the condition on line 504 was never false

505 self.options.series = self.suite_info.target_suite.name 

506 

507 if self.options.heidi_output and not hasattr(self.options, "heidi_delta_output"): 507 ↛ 510line 507 didn't jump to line 510, because the condition on line 507 was never false

508 self.options.heidi_delta_output = self.options.heidi_output + "Delta" 

509 

510 self.options.smooth_updates = self.options.smooth_updates.split() 

511 

512 parse_option(self.options, 'ignore_cruft', to_bool=True) 

513 parse_option(self.options, 'check_consistency_level', default=2, to_int=True) 

514 parse_option(self.options, 'build_url') 

515 

516 self._policy_engine.load_policies(self.options, self.suite_info, MIGRATION_POLICIES) 

517 

518 @property 

519 def hints(self): 

520 return self._hint_parser.hints 

521 

    def _load_faux_packages(self, faux_packages_file: str):
        """Loads fake packages

        In rare cases, it is useful to create a "fake" package that can be used
        to satisfy dependencies. This is usually needed for packages that are
        not shipped directly on this mirror but are a prerequisite for using it
        (e.g. some vendors provide non-distributable "setup" packages, and
        contrib/non-free packages depend on these).

        :param faux_packages_file: Path to the file containing the fake package definitions
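        The file consists of deb822-style stanzas, one per faux package; only
        the Package field is required, and the remaining fields (Version,
        Provides, Architecture, Component, Multi-Arch) fall back to the
        defaults visible in the parsing code below. A hypothetical stanza:

        ```
        Package: vendor-setup
        Version: 1.0-1
        Architecture: amd64 i386
        Component: non-free
        Provides: vendor-base
        ```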

531 """ 

532 tag_file = apt_pkg.TagFile(faux_packages_file) 

533 get_field = tag_file.section.get 

534 step = tag_file.step 

535 no = 0 

536 pri_source_suite = self.suite_info.primary_source_suite 

537 target_suite = self.suite_info.target_suite 

538 

539 while step(): 

540 no += 1 

541 pkg_name = get_field('Package', None) 

542 if pkg_name is None: # pragma: no cover 

543 raise ValueError("Missing Package field in paragraph %d (file %s)" % (no, faux_packages_file)) 

544 pkg_name = sys.intern(pkg_name) 

545 version = sys.intern(get_field('Version', '1.0-1')) 

546 provides_raw = get_field('Provides') 

547 archs_raw = get_field('Architecture', None) 

548 component = get_field('Component', 'non-free') 

549 if archs_raw: 549 ↛ 550line 549 didn't jump to line 550, because the condition on line 549 was never true

550 archs = archs_raw.split() 

551 else: 

552 archs = self.options.architectures 

553 faux_section = 'faux' 

554 if component != 'main': 554 ↛ 556line 554 didn't jump to line 556, because the condition on line 554 was never false

555 faux_section = "%s/faux" % component 

556 src_data = SourcePackage(pkg_name, 

557 version, 

558 sys.intern(faux_section), 

559 set(), 

560 None, 

561 True, 

562 None, 

563 None, 

564 [], 

565 [], 

566 ) 

567 

568 target_suite.sources[pkg_name] = src_data 

569 pri_source_suite.sources[pkg_name] = src_data 

570 

571 for arch in archs: 

572 pkg_id = BinaryPackageId(pkg_name, version, arch) 

573 if provides_raw: 573 ↛ 574line 573 didn't jump to line 574, because the condition on line 573 was never true

574 provides = parse_provides(provides_raw, pkg_id=pkg_id, logger=self.logger) 

575 else: 

576 provides = [] 

577 bin_data = BinaryPackage(version, 

578 faux_section, 

579 pkg_name, 

580 version, 

581 arch, 

582 get_field('Multi-Arch'), 

583 None, 

584 None, 

585 provides, 

586 False, 

587 pkg_id, 

588 [], 

589 ) 

590 

591 src_data.binaries.add(pkg_id) 

592 target_suite.binaries[arch][pkg_name] = bin_data 

593 pri_source_suite.binaries[arch][pkg_name] = bin_data 

594 

595 # register provided packages with the target suite provides table 

596 for provided_pkg, provided_version, _ in bin_data.provides: 596 ↛ 597line 596 didn't jump to line 597, because the loop on line 596 never started

597 target_suite.provides_table[arch][provided_pkg].add((pkg_name, provided_version)) 

598 

599 self.all_binaries[pkg_id] = bin_data 

600 

    def _load_constraints(self, constraints_file):
        """Loads configurable constraints

        The constraints file can contain extra rules that Britney should
        attempt to satisfy. An example would be "keep package X in testing
        and ensure it is installable".

        :param constraints_file: Path to the file containing the constraints
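        The file consists of deb822-style stanzas; Fake-Package-Name,
        Constraint and Package-List are mandatory, and the only supported
        constraint is present-and-installable. A hypothetical stanza
        (architecture restrictions use the [arch1 arch2] form):

        ```
        Fake-Package-Name: keep-build-essential
        Constraint: present-and-installable
        Package-List:
         build-essential
         gcc-multilib [amd64 i386]
        ```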

609 """ 

610 tag_file = apt_pkg.TagFile(constraints_file) 

611 get_field = tag_file.section.get 

612 step = tag_file.step 

613 no = 0 

614 faux_version = sys.intern('1') 

615 faux_section = sys.intern('faux') 

616 keep_installable = [] 

617 constraints = { 

618 'keep-installable': keep_installable 

619 } 

620 pri_source_suite = self.suite_info.primary_source_suite 

621 target_suite = self.suite_info.target_suite 

622 

623 while step(): 

624 no += 1 

625 pkg_name = get_field('Fake-Package-Name', None) 

626 if pkg_name is None: # pragma: no cover 

627 raise ValueError("Missing Fake-Package-Name field in paragraph %d (file %s)" % (no, constraints_file)) 

628 pkg_name = sys.intern(pkg_name) 

629 

630 def mandatory_field(x): 

631 v = get_field(x, None) 

632 if v is None: # pragma: no cover 

633 raise ValueError("Missing %s field for %s (file %s)" % (x, pkg_name, constraints_file)) 

634 return v 

635 

636 constraint = mandatory_field('Constraint') 

637 if constraint not in {'present-and-installable'}: # pragma: no cover 

638 raise ValueError("Unsupported constraint %s for %s (file %s)" % (constraint, pkg_name, constraints_file)) 

639 

640 self.logger.info(" - constraint %s", pkg_name) 

641 

642 pkg_list = [x.strip() for x in mandatory_field('Package-List').split("\n") 

643 if x.strip() != '' and not x.strip().startswith("#")] 

644 src_data = SourcePackage(pkg_name, 

645 faux_version, 

646 faux_section, 

647 set(), 

648 None, 

649 True, 

650 None, 

651 None, 

652 [], 

653 [], 

654 ) 

655 target_suite.sources[pkg_name] = src_data 

656 pri_source_suite.sources[pkg_name] = src_data 

657 keep_installable.append(pkg_name) 

658 for arch in self.options.architectures: 

659 deps = [] 

660 for pkg_spec in pkg_list: 

661 s = pkg_spec.split(None, 1) 

662 if len(s) == 1: 

663 deps.append(s[0]) 

664 else: 

665 pkg, arch_res = s 

666 if not (arch_res.startswith('[') and arch_res.endswith(']')): # pragma: no cover 

667 raise ValueError("Invalid arch-restriction on %s - should be [arch1 arch2] (for %s file %s)" 

668 % (pkg, pkg_name, constraints_file)) 

669 arch_res = arch_res[1:-1].split() 

670 if not arch_res: # pragma: no cover 

671 msg = "Empty arch-restriction for %s: Uses comma or negation (for %s file %s)" 

672 raise ValueError(msg % (pkg, pkg_name, constraints_file)) 

673 for a in arch_res: 

674 if a == arch: 

675 deps.append(pkg) 

676 elif ',' in a or '!' in a: # pragma: no cover 

677 msg = "Invalid arch-restriction for %s: Uses comma or negation (for %s file %s)" 

678 raise ValueError(msg % (pkg, pkg_name, constraints_file)) 

679 pkg_id = BinaryPackageId(pkg_name, faux_version, arch) 

680 bin_data = BinaryPackage(faux_version, 

681 faux_section, 

682 pkg_name, 

683 faux_version, 

684 arch, 

685 'no', 

686 ', '.join(deps), 

687 None, 

688 [], 

689 False, 

690 pkg_id, 

691 [], 

692 ) 

693 src_data.binaries.add(pkg_id) 

694 target_suite.binaries[arch][pkg_name] = bin_data 

695 pri_source_suite.binaries[arch][pkg_name] = bin_data 

696 self.all_binaries[pkg_id] = bin_data 

697 

698 return constraints 

699 

    # Data reading/writing methods
    # ----------------------------

    def read_hints(self, hintsdir):
        """Read the hint commands from the specified directory

        The hint commands are read from the files contained in the directory
        specified by the `hintsdir' parameter. The name of each file has to
        match the name of the user authorized to provide the hints in it.

        The files contain lines with the format:

            <command> <package-name>[/<version>]

        The parsed hints are stored in the hint parser and are available
        through the `hints' property.
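        For example, a hints file for a (hypothetical) user `jane' might
        contain the following lines; which commands are actually honoured
        depends on the HINTS_* level configured for that user:

        ```
        unblock somepackage/1.2-3
        remove oldcruft/0.9-1
        age-days 5 urgentfix/2.0-1
        ```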

717 """ 

718 

719 for who in self.HINTS.keys(): 

720 if who == 'command-line': 

721 lines = self.options.hints and self.options.hints.split(';') or () 

722 filename = '<cmd-line>' 

723 self._hint_parser.parse_hints(who, self.HINTS[who], filename, lines) 

724 else: 

725 filename = os.path.join(hintsdir, who) 

726 if not os.path.isfile(filename): 

727 self.logger.error("Cannot read hints list from %s, no such file!", filename) 

728 continue 

729 self.logger.info("Loading hints list from %s", filename) 

730 with open(filename, encoding='utf-8') as f: 

731 self._hint_parser.parse_hints(who, self.HINTS[who], filename, f) 

732 

733 hints = self._hint_parser.hints 

734 

735 for x in ["block", "block-all", "block-udeb", "unblock", "unblock-udeb", "force", "urgent", "remove", "age-days"]: 

736 z = defaultdict(dict) 

737 for hint in hints[x]: 

738 package = hint.package 

739 architecture = hint.architecture 

740 key = (hint, hint.user) 

741 if package in z and architecture in z[package] and z[package][architecture] != key: 

742 hint2 = z[package][architecture][0] 

743 if x in ['unblock', 'unblock-udeb']: 

744 if apt_pkg.version_compare(hint2.version, hint.version) < 0: 

745 # This hint is for a newer version, so discard the old one 

746 self.logger.warning("Overriding %s[%s] = ('%s', '%s', '%s') with ('%s', '%s', '%s')", 

747 x, package, hint2.version, hint2.architecture, 

748 hint2.user, hint.version, hint.architecture, hint.user) 

749 hint2.set_active(False) 

750 else: 

751 # This hint is for an older version, so ignore it in favour of the new one 

752 self.logger.warning("Ignoring %s[%s] = ('%s', '%s', '%s'), ('%s', '%s', '%s') is higher or equal", 

753 x, package, hint.version, hint.architecture, hint.user, 

754 hint2.version, hint2.architecture, hint2.user) 

755 hint.set_active(False) 

756 else: 

757 self.logger.warning("Overriding %s[%s] = ('%s', '%s') with ('%s', '%s')", 

758 x, package, hint2.user, hint2, hint.user, hint) 

759 hint2.set_active(False) 

760 

761 z[package][architecture] = key 

762 

763 for hint in hints['allow-uninst']: 

764 if hint.architecture == 'source': 

765 for arch in self.options.architectures: 

766 self.allow_uninst[arch].add(hint.package) 

767 else: 

768 self.allow_uninst[hint.architecture].add(hint.package) 

769 

770 # Sanity check the hints hash 

771 if len(hints["block"]) == 0 and len(hints["block-udeb"]) == 0: 

772 self.logger.warning("WARNING: No block hints at all, not even udeb ones!") 

773 
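The hint-file format documented in the `read_hints` docstring above (`<command> <package-name>[/<version>]`) can be illustrated with a small stand-alone parser. This is a minimal sketch, not britney's actual `HintParser`; the `parse_hint_lines` name and the returned structure are assumptions for illustration only:

```python
from collections import defaultdict

def parse_hint_lines(lines):
    """Parse hint lines of the form '<command> <package>[/<version>]'.

    Returns a dict mapping each command to a list of (package, version)
    pairs, with version set to None when no '/<version>' suffix is given.
    Blank lines and '#' comments are skipped.
    """
    hints = defaultdict(list)
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        command, *entries = line.split()
        for entry in entries:
            # Split an optional '/<version>' suffix off the package name.
            name, _, version = entry.partition('/')
            hints[command].append((name, version or None))
    return dict(hints)
```

For example, the line `unblock foo/1.2-3 bar` yields one `unblock` entry with a pinned version and one without.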

774 def write_excuses(self): 

775 """Produce and write the update excuses 

776 

777 This method handles the update excuses generation: the packages are 

778 looked at to determine whether they are valid candidates. For the details 

779 of this procedure, please refer to the module docstring. 

780 """ 

781 

782 self.logger.info("Update Excuses generation started") 

783 

784 mi_factory = self._migration_item_factory 

785 excusefinder = ExcuseFinder(self.options, self.suite_info, self.all_binaries, 

786 self.pkg_universe, self._policy_engine, mi_factory, self.hints) 

787 

788 excuses, upgrade_me = excusefinder.find_actionable_excuses() 

789 self.excuses = excuses 

790 

791 # sort the list of candidates 

792 self.upgrade_me = sorted(upgrade_me) 

793 old_lib_removals = old_libraries(mi_factory, self.suite_info, self.options.outofsync_arches) 

794 self.upgrade_me.extend(old_lib_removals) 

795 self.output_logger.info("List of old libraries added to upgrade_me (%d):", len(old_lib_removals)) 

796 log_and_format_old_libraries(self.output_logger, old_lib_removals) 

797 

798 # write excuses to the output file 

799 if not self.options.dry_run: 

800 self.logger.info("> Writing Excuses to %s", self.options.excuses_output) 

801 write_excuses(excuses, self.options.excuses_output, 

802 output_format="legacy-html") 

803 if hasattr(self.options, 'excuses_yaml_output'): 

804 self.logger.info("> Writing YAML Excuses to %s", self.options.excuses_yaml_output) 

805 write_excuses(excuses, self.options.excuses_yaml_output, 

806 output_format="yaml") 

807 

808 self.logger.info("Update Excuses generation completed") 

809 

810 # Upgrade run 

811 # ----------- 

812 

813 def eval_nuninst(self, nuninst, original=None): 

814 """Return a string which represents the uninstallability counters 

815 

816 This method returns a string which represents the uninstallability 

817 counters reading the uninstallability statistics `nuninst` and, if 

818 present, merging the results with the `original` one. 

819 

820 An example of the output string is: 

821 1+2: i-0:a-0:a-0:h-0:i-1:m-0:m-0:p-0:a-0:m-0:s-2:s-0 

822 

823 where the first part is the number of broken packages in non-break 

824 architectures + the total number of broken packages for all the 

825 architectures. 

826 """ 

827 res = [] 

828 total = 0 

829 totalbreak = 0 

830 for arch in self.options.architectures: 

831 if arch in nuninst: 

832 n = len(nuninst[arch]) 

833 elif original and arch in original: 

834 n = len(original[arch]) 

835 else: 

836 continue 

837 if arch in self.options.break_arches: 

838 totalbreak = totalbreak + n 

839 else: 

840 total = total + n 

841 res.append("%s-%d" % (arch[0], n)) 

842 return "%d+%d: %s" % (total, totalbreak, ":".join(res)) 

843 
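The counter-string format documented in `eval_nuninst` above (e.g. `1+2: i-0:a-0:...`) can be reproduced by a small stand-alone helper. A minimal sketch assuming plain dict/set inputs; unlike the real method it has no `original` fallback and simply counts missing architectures as zero:

```python
def summarize_nuninst(nuninst, architectures, break_arches=()):
    """Build a summary like '1+2: i-1:a-0:s-2' from per-arch broken sets.

    The leading 'total+break' pair counts broken packages outside and
    inside the break architectures; each 'x-n' entry combines the first
    letter of the architecture with its broken-package count.
    """
    parts = []
    total = totalbreak = 0
    for arch in architectures:
        n = len(nuninst.get(arch, ()))
        if arch in break_arches:
            totalbreak += n
        else:
            total += n
        parts.append("%s-%d" % (arch[0], n))
    return "%d+%d: %s" % (total, totalbreak, ":".join(parts))
```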

844 def iter_packages(self, packages, selected, nuninst=None): 

845 """Iterate over the list of actions and apply them one-by-one 

846 

847 This method applies the changes from `packages` to testing, checking the uninstallability 

848 counters for every action performed. If the action does not improve them, it is reverted. 

849 The method returns the new uninstallability counters and the remaining actions if the 

850 final result is successful, otherwise (None, []). 

851 

852 :param packages: list of MigrationItem 

853 :param selected: list of the MigrationItem objects already selected 

854 :param nuninst: dict mapping each architecture to its set of uninstallable binary package names 

855 """ 

856 assert self.suite_info is not None # for type checking 

857 group_info = {} 

858 rescheduled_packages = packages 

859 maybe_rescheduled_packages = [] 

860 output_logger = self.output_logger 

861 solver = InstallabilitySolver(self.pkg_universe, self._inst_tester) 

862 mm = self._migration_manager 

863 target_suite = self.suite_info.target_suite 

864 

865 for y in sorted((y for y in packages), key=attrgetter('uvname')): 

866 try: 

867 _, updates, rms, _ = mm.compute_groups(y) 

868 result = (y, sorted(updates), sorted(rms)) 

869 group_info[y] = result 

870 except MigrationConstraintException as e: 

871 rescheduled_packages.remove(y) 

872 output_logger.info("not adding package to list: %s", (y.package)) 

873 output_logger.info(" got exception: %s" % (repr(e))) 

874 

875 if nuninst: 

876 nuninst_orig = nuninst 

877 else: 

878 nuninst_orig = self.nuninst_orig 

879 

880 nuninst_last_accepted = nuninst_orig 

881 

882 output_logger.info("recur: [] %s %d/0", ",".join(x.uvname for x in selected), len(packages)) 

883 while rescheduled_packages: 

884 groups = [group_info[x] for x in rescheduled_packages] 

885 worklist = solver.solve_groups(groups) 

886 rescheduled_packages = [] 

887 

888 worklist.reverse() 

889 

890 while worklist: 

891 comp = worklist.pop() 

892 comp_name = ' '.join(item.uvname for item in comp) 

893 output_logger.info("trying: %s" % comp_name) 

894 with mm.start_transaction() as transaction: 

895 accepted = False 

896 try: 

897 accepted, nuninst_after, failed_arch, new_cruft = mm.migrate_items_to_target_suite( 

898 comp, 

899 nuninst_last_accepted 

900 ) 

901 if accepted: 

902 selected.extend(comp) 

903 transaction.commit() 

904 output_logger.info("accepted: %s", comp_name) 

905 output_logger.info(" ori: %s", self.eval_nuninst(nuninst_orig)) 

906 output_logger.info(" pre: %s", self.eval_nuninst(nuninst_last_accepted)) 

907 output_logger.info(" now: %s", self.eval_nuninst(nuninst_after)) 

908 if new_cruft: 

909 output_logger.info( 

910 " added new cruft items to list: %s", 

911 " ".join(x.uvname for x in sorted(new_cruft))) 

912 

913 if len(selected) <= 20: 

914 output_logger.info(" all: %s", " ".join(x.uvname for x in selected)) 

915 else: 

916 output_logger.info(" most: (%d) .. %s", 

917 len(selected), 

918 " ".join(x.uvname for x in selected[-20:])) 

919 if self.options.check_consistency_level >= 3: 

920 target_suite.check_suite_source_pkg_consistency('iter_packages after commit') 

921 nuninst_last_accepted = nuninst_after 

922 for cruft_item in new_cruft: 

923 try: 

924 _, updates, rms, _ = mm.compute_groups(cruft_item) 

925 result = (cruft_item, sorted(updates), sorted(rms)) 

926 group_info[cruft_item] = result 

927 worklist.append([cruft_item]) 

928 except MigrationConstraintException as e: 

929 output_logger.info( 

930 " got exception adding cruft item %s to list: %s" % 

931 (cruft_item.uvname, repr(e))) 

932 rescheduled_packages.extend(maybe_rescheduled_packages) 

933 maybe_rescheduled_packages.clear() 

934 else: 

935 transaction.rollback() 

936 broken = sorted(b for b in nuninst_after[failed_arch] 

937 if b not in nuninst_last_accepted[failed_arch]) 

938 compare_nuninst = None 

939 if any(item for item in comp if item.architecture != 'source'): 

940 compare_nuninst = nuninst_last_accepted 

941 # NB: try_migration already reverted this for us, so just print the results and move on 

942 output_logger.info("skipped: %s (%d, %d, %d)", 

943 comp_name, 

944 len(rescheduled_packages), 

945 len(maybe_rescheduled_packages), 

946 len(worklist) 

947 ) 

948 output_logger.info(" got: %s", self.eval_nuninst(nuninst_after, compare_nuninst)) 

949 output_logger.info(" * %s: %s", failed_arch, ", ".join(broken)) 

950 if self.options.check_consistency_level >= 3: 

951 target_suite.check_suite_source_pkg_consistency('iter_package after rollback (not accepted)') 

952 

953 except MigrationConstraintException as e: 

954 transaction.rollback() 

955 output_logger.info("skipped: %s (%d, %d, %d)", 

956 comp_name, 

957 len(rescheduled_packages), 

958 len(maybe_rescheduled_packages), 

959 len(worklist) 

960 ) 

961 output_logger.info(" got exception: %s" % (repr(e))) 

962 if self.options.check_consistency_level >= 3: 

963 target_suite.check_suite_source_pkg_consistency( 

964 'iter_package after rollback (MigrationConstraintException)') 

965 

966 if not accepted: 

967 if len(comp) > 1: 

968 output_logger.info(" - splitting the component into single items and retrying them") 

969 worklist.extend([item] for item in comp) 

970 else: 

971 maybe_rescheduled_packages.append(comp[0]) 

972 

973 output_logger.info(" finish: [%s]", ",".join(x.uvname for x in selected)) 

974 output_logger.info("endloop: %s", self.eval_nuninst(self.nuninst_orig)) 

975 output_logger.info(" now: %s", self.eval_nuninst(nuninst_last_accepted)) 

976 format_and_log_uninst(output_logger, 

977 self.options.architectures, 

978 newly_uninst(self.nuninst_orig, nuninst_last_accepted) 

979 ) 

980 output_logger.info("") 

981 

982 return (nuninst_last_accepted, maybe_rescheduled_packages) 

983 
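The commit/rollback pattern in `iter_packages` above — try a component, keep it when the uninstallability counters do not regress, otherwise split it into single items and retry — can be sketched generically. Everything here is hypothetical (there is no `MigrationManager` or transaction); `apply_component` and `score` are assumed stand-ins for migrating a component and evaluating the resulting counters:

```python
def greedy_migrate(components, state, apply_component, score):
    """Try each component; keep those that do not worsen the score.

    `apply_component(state, comp)` returns a candidate new state, and
    `score(state)` maps a state to a number where lower is better.
    Rejected multi-item components are split into single items and
    retried, mirroring the split-and-retry step in iter_packages.
    """
    worklist = list(reversed(components))
    rejected = []
    while worklist:
        comp = worklist.pop()
        candidate = apply_component(state, comp)
        if score(candidate) <= score(state):
            state = candidate                         # "commit"
        elif len(comp) > 1:
            worklist.extend([item] for item in comp)  # split and retry
        else:
            rejected.append(comp[0])                  # "rollback"
    return state, rejected
```

Here a component of two items that regresses as a whole is split, and the helpful half still migrates on its own.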

984 def do_all(self, hinttype=None, init=None, actions=None): 

985 """Testing update runner 

986 

987 This method tries to update testing checking the uninstallability 

988 counters before and after the actions to decide if the update was 

989 successful or not. 

990 """ 

991 selected = [] 

992 if actions: 

993 upgrade_me = actions[:] 

994 else: 

995 upgrade_me = self.upgrade_me[:] 

996 nuninst_start = self.nuninst_orig 

997 output_logger = self.output_logger 

998 target_suite = self.suite_info.target_suite 

999 

1000 # these are special parameters for hints processing 

1001 force = False 

1002 recurse = True 

1003 nuninst_end = None 

1004 extra = [] 

1005 mm = self._migration_manager 

1006 

1007 if hinttype == "easy" or hinttype == "force-hint": 

1008 force = hinttype == "force-hint" 

1009 recurse = False 

1010 

1011 # if we have a list of initial packages, check them 

1012 if init: 

1013 for x in init: 

1014 if x not in upgrade_me: 

1015 output_logger.warning("failed: %s is not a valid candidate (or it has already migrated)", x.uvname) 

1016 return None 

1017 selected.append(x) 

1018 upgrade_me.remove(x) 

1019 

1020 output_logger.info("start: %s", self.eval_nuninst(nuninst_start)) 

1021 output_logger.info("orig: %s", self.eval_nuninst(nuninst_start)) 

1022 

1023 if init and not force: 

1024 # We will need to be able to roll back (e.g. easy or a "hint"-hint) 

1025 _start_transaction = mm.start_transaction 

1026 else: 

1027 # No "outer" transaction needed as we will never need to rollback 

1028 # (e.g. "force-hint" or a regular "main run"). Emulate the start_transaction 

1029 # call from the MigrationManager, so the rest of the code follows the 

1030 # same flow regardless of whether we need the transaction or not. 

1031 

1032 @contextlib.contextmanager 

1033 def _start_transaction(): 

1034 yield None 

1035 

1036 with _start_transaction() as transaction: 

1037 

1038 if init: 

1039 # init => a hint (e.g. "easy") - so do the hint run 

1040 (_, nuninst_end, _, new_cruft) = mm.migrate_items_to_target_suite(selected, 

1041 self.nuninst_orig, 

1042 stop_on_first_regression=False) 

1043 

1044 if recurse: 

1045 # Ensure upgrade_me and selected do not overlap, if we 

1046 # follow-up with a recurse ("hint"-hint). 

1047 upgrade_me = [x for x in upgrade_me if x not in set(selected)] 

1048 else: 

1049 # On non-recursive hints check for cruft and purge it proactively in case it "fixes" the hint. 

1050 cruft = [x for x in upgrade_me if x.is_cruft_removal] 

1051 if new_cruft: 

1052 output_logger.info( 

1053 "Change added new cruft items to list: %s", 

1054 " ".join(x.uvname for x in sorted(new_cruft))) 

1055 cruft.extend(new_cruft) 

1056 if cruft: 

1057 output_logger.info("Checking if changes enable cruft removal") 

1058 (nuninst_end, remaining_cruft) = self.iter_packages(cruft, 

1059 selected, 

1060 nuninst=nuninst_end) 

1061 output_logger.info("Removed %d of %d cruft item(s) after the changes", 

1062 len(cruft) - len(remaining_cruft), len(cruft)) 

1063 new_cruft.difference_update(remaining_cruft) 

1064 

1065 # Add new cruft items regardless of whether we recurse. A future run might clean 

1066 # them for us. 

1067 upgrade_me.extend(new_cruft) 

1068 

1069 if recurse: 

1070 # Either the main run or the recursive run of a "hint"-hint. 

1071 (nuninst_end, extra) = self.iter_packages(upgrade_me, 

1072 selected, 

1073 nuninst=nuninst_end) 

1074 

1075 nuninst_end_str = self.eval_nuninst(nuninst_end) 

1076 

1077 if not recurse: 

1078 # easy or force-hint 

1079 output_logger.info("easy: %s", nuninst_end_str) 

1080 

1081 if not force: 

1082 format_and_log_uninst(self.output_logger, 

1083 self.options.architectures, 

1084 newly_uninst(nuninst_start, nuninst_end) 

1085 ) 

1086 

1087 if force: 

1088 # Force implies "unconditionally better" 

1089 better = True 

1090 else: 

1091 break_arches = set(self.options.break_arches) 

1092 if all(x.architecture in break_arches for x in selected): 

1093 # If we only migrated items from break-arches, then we 

1094 # do not allow any regressions on these architectures. 

1095 # This usually only happens with hints 

1096 break_arches = set() 

1097 better = is_nuninst_asgood_generous(self.constraints, 

1098 self.allow_uninst, 

1099 self.options.architectures, 

1100 self.nuninst_orig, 

1101 nuninst_end, 

1102 break_arches) 

1103 

1104 if better: 

1105 # Result accepted either by force or by being better than the original result. 

1106 output_logger.info("final: %s", ",".join(sorted(x.uvname for x in selected))) 

1107 output_logger.info("start: %s", self.eval_nuninst(nuninst_start)) 

1108 output_logger.info(" orig: %s", self.eval_nuninst(self.nuninst_orig)) 

1109 output_logger.info(" end: %s", nuninst_end_str) 

1110 if force: 

1111 broken = newly_uninst(nuninst_start, nuninst_end) 

1112 if broken: 

1113 output_logger.warning("force breaks:") 

1114 format_and_log_uninst(self.output_logger, 

1115 self.options.architectures, 

1116 broken, 

1117 loglevel=logging.WARNING, 

1118 ) 

1119 else: 

1120 output_logger.info("force did not break any packages") 

1121 output_logger.info("SUCCESS (%d/%d)", len(actions or self.upgrade_me), len(extra)) 

1122 self.nuninst_orig = nuninst_end 

1123 self.all_selected += selected 

1124 if transaction: 

1125 transaction.commit() 

1126 if self.options.check_consistency_level >= 2: 

1127 target_suite.check_suite_source_pkg_consistency('do_all after commit') 

1128 if not actions: 

1129 if recurse: 

1130 self.upgrade_me = extra 

1131 else: 

1132 self.upgrade_me = [x for x in self.upgrade_me if x not in set(selected)] 

1133 else: 

1134 output_logger.info("FAILED\n") 

1135 if not transaction: 

1136 # if we 'FAILED', but we cannot rollback, we will probably 

1137 # leave a broken state behind 

1138 # this should not happen 

1139 raise AssertionError("do_all FAILED but no transaction to rollback") 

1140 transaction.rollback() 

1141 if self.options.check_consistency_level >= 2: 

1142 target_suite.check_suite_source_pkg_consistency('do_all after rollback') 

1143 

1144 output_logger.info("") 

1145 

1146 def assert_nuninst_is_correct(self): 

1147 self.logger.info("> Update complete - Verifying non-installability counters") 

1148 

1149 cached_nuninst = self.nuninst_orig 

1150 self._inst_tester.compute_installability() 

1151 computed_nuninst = compile_nuninst(self.suite_info.target_suite, 

1152 self.options.architectures, 

1153 self.options.nobreakall_arches) 

1154 if cached_nuninst != computed_nuninst: # pragma: no cover 

1155 only_on_break_archs = True 

1156 self.logger.error("==================== NUNINST OUT OF SYNC =========================") 

1157 for arch in self.options.architectures: 

1158 expected_nuninst = set(cached_nuninst[arch]) 

1159 actual_nuninst = set(computed_nuninst[arch]) 

1160 false_negatives = actual_nuninst - expected_nuninst 

1161 false_positives = expected_nuninst - actual_nuninst 

1162 # Britney does not quite work correctly with 

1163 # break/out-of-sync arches, so ignore issues there for now. 

1164 if (false_negatives or false_positives) and arch not in self.options.break_arches: 

1165 only_on_break_archs = False 

1166 if false_negatives: 

1167 self.logger.error(" %s - unnoticed nuninst: %s", arch, str(false_negatives)) 

1168 if false_positives: 

1169 self.logger.error(" %s - invalid nuninst: %s", arch, str(false_positives)) 

1170 self.logger.info(" %s - actual nuninst: %s", arch, str(sorted(actual_nuninst))) 

1171 self.logger.error("==================== NUNINST OUT OF SYNC =========================") 

1172 if not only_on_break_archs: 

1173 raise AssertionError("NUNINST OUT OF SYNC") 

1174 else: 

1175 self.logger.warning("Nuninst is out of sync on some break arches") 

1176 

1177 self.logger.info("> All non-installability counters are ok") 

1178 

1179 def upgrade_testing(self): 

1180 """Upgrade testing using the packages from the source suites 

1181 

1182 This method tries to upgrade testing using the packages from the 

1183 source suites. 

1184 Before running the do_all method, it tries the easy and force-hint 

1185 commands. 

1186 """ 

1187 

1188 output_logger = self.output_logger 

1189 self.logger.info("Starting the upgrade test") 

1190 output_logger.info("Generated on: %s", time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time()))) 

1191 output_logger.info("Arch order is: %s", ", ".join(self.options.architectures)) 

1192 

1193 if not self.options.actions: 

1194 # process `easy' hints 

1195 for x in self.hints['easy']: 

1196 self.do_hint("easy", x.user, x.packages) 

1197 

1198 # process `force-hint' hints 

1199 for x in self.hints["force-hint"]: 

1200 self.do_hint("force-hint", x.user, x.packages) 

1201 

1202 # run the first round of the upgrade 

1203 # - do separate runs for break arches 

1204 allpackages = [] 

1205 normpackages = self.upgrade_me[:] 

1206 archpackages = {} 

1207 for a in self.options.break_arches: 

1208 archpackages[a] = [p for p in normpackages if p.architecture == a] 

1209 normpackages = [p for p in normpackages if p not in archpackages[a]] 

1210 self.upgrade_me = normpackages 

1211 output_logger.info("info: main run") 

1212 self.do_all() 

1213 allpackages += self.upgrade_me 

1214 for a in self.options.break_arches: 

1215 backup = self.options.break_arches 

1216 self.options.break_arches = " ".join(x for x in self.options.break_arches if x != a) 

1217 self.upgrade_me = archpackages[a] 

1218 output_logger.info("info: broken arch run for %s", a) 

1219 self.do_all() 

1220 allpackages += self.upgrade_me 

1221 self.options.break_arches = backup 

1222 self.upgrade_me = allpackages 

1223 

1224 if self.options.actions: 

1225 self.printuninstchange() 

1226 return 

1227 

1228 # process `hint' hints 

1229 hintcnt = 0 

1230 for x in self.hints["hint"][:50]: 

1231 if hintcnt > 50: 

1232 output_logger.info("Skipping remaining hints...") 

1233 break 

1234 if self.do_hint("hint", x.user, x.packages): 

1235 hintcnt += 1 

1236 

1237 # run the auto hinter 

1238 self.run_auto_hinter() 

1239 

1240 if getattr(self.options, "remove_obsolete", "yes") == "yes": 

1241 # obsolete source packages 

1242 # a package is obsolete if none of the binary packages in testing 

1243 # are built by it 

1244 self.logger.info("> Removing obsolete source packages from the target suite") 

1245 # local copies for performance 

1246 target_suite = self.suite_info.target_suite 

1247 sources_t = target_suite.sources 

1248 binaries_t = target_suite.binaries 

1249 mi_factory = self._migration_item_factory 

1250 used = set(binaries_t[arch][binary].source 

1251 for arch in binaries_t 

1252 for binary in binaries_t[arch] 

1253 ) 

1254 removals = [mi_factory.parse_item("-%s/%s" % (source, sources_t[source].version), auto_correct=False) 

1255 for source in sources_t if source not in used 

1256 ] 

1257 if removals: 

1258 output_logger.info("Removing obsolete source packages from the target suite (%d):", len(removals)) 

1259 self.do_all(actions=removals) 

1260 

1261 # smooth updates 

1262 removals = old_libraries(self._migration_item_factory, self.suite_info, self.options.outofsync_arches) 

1263 if removals: 

1264 output_logger.info("Removing packages left in the target suite (e.g. smooth updates or cruft)") 

1265 log_and_format_old_libraries(self.output_logger, removals) 

1266 self.do_all(actions=removals) 

1267 removals = old_libraries(self._migration_item_factory, self.suite_info, self.options.outofsync_arches) 

1268 

1269 output_logger.info("List of old libraries in the target suite (%d):", len(removals)) 

1270 log_and_format_old_libraries(self.output_logger, removals) 

1271 

1272 self.printuninstchange() 

1273 if self.options.check_consistency_level >= 1: 

1274 target_suite = self.suite_info.target_suite 

1275 self.assert_nuninst_is_correct() 

1276 target_suite.check_suite_source_pkg_consistency('end') 

1277 

1278 # output files 

1279 if self.options.heidi_output and not self.options.dry_run: 

1280 target_suite = self.suite_info.target_suite 

1281 

1282 # write HeidiResult 

1283 self.logger.info("Writing Heidi results to %s", self.options.heidi_output) 

1284 write_heidi(self.options.heidi_output, 

1285 target_suite, 

1286 outofsync_arches=self.options.outofsync_arches) 

1287 

1288 self.logger.info("Writing delta to %s", self.options.heidi_delta_output) 

1289 write_heidi_delta(self.options.heidi_delta_output, 

1290 self.all_selected) 

1291 

1292 self.logger.info("Test completed!") 

1293 

1294 def printuninstchange(self): 

1295 self.logger.info("Checking for newly uninstallable packages") 

1296 uninst = newly_uninst(self.nuninst_orig_save, self.nuninst_orig) 

1297 

1298 if uninst: 

1299 self.output_logger.warning("") 

1300 self.output_logger.warning("Newly uninstallable packages in the target suite:") 

1301 format_and_log_uninst(self.output_logger, 

1302 self.options.architectures, 

1303 uninst, 

1304 loglevel=logging.WARNING, 

1305 ) 

1306 

1307 def hint_tester(self): 

1308 """Run a command line interface to test hints 

1309 

1310 This method provides a command line interface for the release team to 

1311 try hints and evaluate the results. 

1312 """ 

1313 import readline 

1314 from britney2.completer import Completer 

1315 

1316 histfile = os.path.expanduser('~/.britney2_history') 

1317 if os.path.exists(histfile): 

1318 readline.read_history_file(histfile) 

1319 

1320 readline.parse_and_bind('tab: complete') 

1321 readline.set_completer(Completer(self).completer) 

1322 # Package names can contain "-" and we use "/" in our presentation of them as well, 

1323 # so ensure readline does not split on these characters. 

1324 readline.set_completer_delims(readline.get_completer_delims().replace('-', '').replace('/', '')) 

1325 

1326 known_hints = self._hint_parser.registered_hints 

1327 

1328 print("Britney hint tester") 

1329 print() 

1330 print("Besides inputting known britney hints, the following commands are also available") 

1331 print(" * quit/exit - terminates the shell") 

1332 print(" * python-console - jump into an interactive python shell (with the current loaded dataset)") 

1333 print() 

1334 

1335 while True: 

1336 # read the command from the command line 

1337 try: 

1338 user_input = input('britney> ').split() 

1339 except EOFError: 

1340 print("") 

1341 break 

1342 except KeyboardInterrupt: 

1343 print("") 

1344 continue 

1345 # quit the hint tester 

1346 if user_input and user_input[0] in ('quit', 'exit'): 

1347 break 

1348 elif user_input and user_input[0] == 'python-console': 

1349 try: 

1350 import britney2.console 

1351 except ImportError as e: 

1352 print("Failed to import britney.console module: %s" % repr(e)) 

1353 continue 

1354 britney2.console.run_python_console(self) 

1355 print("Returning to the britney hint-tester console") 

1356 # run a hint 

1357 elif user_input and user_input[0] in ('easy', 'hint', 'force-hint'): 

1358 mi_factory = self._migration_item_factory 

1359 try: 

1360 self.do_hint(user_input[0], 'hint-tester', mi_factory.parse_items(user_input[1:])) 

1361 self.printuninstchange() 

1362 except KeyboardInterrupt: 

1363 continue 

1364 elif user_input and user_input[0] in known_hints: 

1365 self._hint_parser.parse_hints('hint-tester', self.HINTS_ALL, '<stdin>', [' '.join(user_input)]) 

1366 self.write_excuses() 

1367 

1368 try: 

1369 readline.write_history_file(histfile) 

1370 except IOError as e: 

1371 self.logger.warning("Could not write %s: %s", histfile, e) 

1372 

1373 def do_hint(self, hinttype, who, pkgvers): 

1374 """Process hints 

1375 

1376 This method processes `easy`, `hint` and `force-hint` hints. If the 

1377 requested version is not in the relevant source suite, then the hint 

1378 is skipped. 

1379 """ 

1380 

1381 output_logger = self.output_logger 

1382 

1383 self.logger.info("> Processing '%s' hint from %s", hinttype, who) 

1384 output_logger.info("Trying %s from %s: %s", hinttype, who, 

1385 " ".join("%s/%s" % (x.uvname, x.version) for x in pkgvers) 

1386 ) 

1387 

1388 issues = [] 

1389 # loop on the requested packages and versions 

1390 for idx in range(len(pkgvers)): 

1391 pkg = pkgvers[idx] 

1392 # skip removal requests 

1393 if pkg.is_removal: 

1394 continue 

1395 

1396 suite = pkg.suite 

1397 

1398 if pkg.package not in suite.sources: 

1399 issues.append("Source %s has no version in %s" % (pkg.package, suite.name)) 

1400 elif apt_pkg.version_compare(suite.sources[pkg.package].version, pkg.version) != 0: 

1401 issues.append("Version mismatch, %s %s != %s" % (pkg.package, pkg.version, 

1402 suite.sources[pkg.package].version)) 

1403 if issues: 

1404 output_logger.warning("%s: Not using hint", ", ".join(issues)) 

1405 return False 

1406 

1407 self.do_all(hinttype, pkgvers) 

1408 return True 

1409 

1410 def get_auto_hinter_hints(self, upgrade_me): 

1411 """Auto-generate "easy" hints. 

1412 

1413 This method attempts to generate "easy" hints for sets of packages which 

1414 must migrate together. Beginning with a package which does not depend on 

1415 any other package (in terms of excuses), a list of dependencies and 

1416 reverse dependencies is recursively created. 

1417 

1418 Once all such lists have been generated, any which are subsets of other 

1419 lists are ignored in favour of the larger lists. The remaining lists are 

1420 then attempted in turn as "easy" hints. 

1421 

1422 We also try to auto-hint circular dependencies by analyzing the update 

1423 excuses relationships: if a set of excuses forms a circular dependency, which 

1424 we already know does not work with the standard do_all algorithm, we try to `easy` them. 

1425 """ 

1426 self.logger.info("> Processing hints from the auto hinter") 

1427 

1428 sources_t = self.suite_info.target_suite.sources 

1429 excuses = self.excuses 

1430 

1431 def excuse_still_valid(excuse): 

1432 source = excuse.source 

1433 arch = excuse.item.architecture 

1434 # TODO for binNMUs, this check is always ok, even if the item 

1435 # migrated already 

1436 valid = (arch != 'source' or 

1437 source not in sources_t or 

1438 sources_t[source].version != excuse.ver[1]) 

1439 # TODO migrated items should be removed from upgrade_me, so this 

1440 # should not happen 

1441 if not valid: 

1442 raise AssertionError("excuse no longer valid %s" % (excuse.item)) 

1443 return valid 

1444 

1445 # consider only excuses which are valid candidates and still relevant. 

1446 valid_excuses = frozenset(e.name for n, e in excuses.items() 

1447 if e.item in upgrade_me 

1448 and excuse_still_valid(e)) 

1449 excuses_deps = {name: valid_excuses.intersection(excuse.get_deps()) 

1450 for name, excuse in excuses.items() if name in valid_excuses} 

1451 excuses_rdeps = defaultdict(set) 

1452 for name, deps in excuses_deps.items(): 

1453 for dep in deps: 

1454 excuses_rdeps[dep].add(name) 

1455 

1456 # loop on them 

1457 candidates = [] 

1458 mincands = [] 

1459 seen_hints = set() 

1460 for e in valid_excuses: 

1461 excuse = excuses[e] 

1462 if not excuse.get_deps(): 

1463 items = [excuse.item] 

1464 orig_size = 1 

1465 looped = False 

1466 seen_items = set() 

1467 seen_items.update(items) 

1468 

1469 for item in items: 

1470 # excuses which depend on "item" or are depended on by it 

1471 new_items = {excuses[x].item for x in chain(excuses_deps[item.name], excuses_rdeps[item.name])} 

1472 new_items -= seen_items 

1473 items.extend(new_items) 

1474 seen_items.update(new_items) 

1475 

1476 if not looped and len(items) > 1: 

1477 orig_size = len(items) 

1478 h = frozenset(seen_items) 

1479 if h not in seen_hints: 

1480 mincands.append(h) 

1481 seen_hints.add(h) 

1482 looped = True 

1483 if len(items) != orig_size: 

1484 h = frozenset(seen_items) 

1485 if h != mincands[-1] and h not in seen_hints: 

1486 candidates.append(h) 

1487 seen_hints.add(h) 

1488 return [candidates, mincands] 
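
The expansion loop above is essentially a transitive-closure walk over the union of the dependency and reverse-dependency graphs: starting from one excuse, it keeps pulling in anything connected in either direction until the set stops growing. A minimal standalone sketch of that idea (`dependency_closure`, `deps` and `rdeps` are illustrative stand-ins for the real `excuses_deps`/`excuses_rdeps` mappings, not britney.py API):

```python
def dependency_closure(start, deps, rdeps):
    """Return every node reachable from 'start' via deps or rdeps edges."""
    items = [start]
    seen = {start}
    for item in items:  # 'items' grows while we iterate over it, like above
        new = (deps.get(item, set()) | rdeps.get(item, set())) - seen
        items.extend(new)
        seen |= new
    return seen

# toy graph: a depends on b, b depends on c
deps = {'a': {'b'}, 'b': {'c'}, 'c': set()}
rdeps = {'b': {'a'}, 'c': {'b'}}
print(dependency_closure('a', deps, rdeps))
```

Because reverse edges are followed too, starting from any node of a connected component yields the whole component, which is what makes the resulting `easy` hints cover mutually dependent excuses.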

1489 

1490 def run_auto_hinter(self): 

1491 for lst in self.get_auto_hinter_hints(self.upgrade_me): 

1492 for hint in lst: 

1493 self.do_hint("easy", "autohinter", sorted(hint)) 

1494 

1495 def nuninst_arch_report(self, nuninst, arch): 

1496 """Print a report of uninstallable packages for one architecture.""" 

1497 all = defaultdict(set) 

1498 binaries_t = self.suite_info.target_suite.binaries 

1499 for p in nuninst[arch]: 

1500 pkg = binaries_t[arch][p] 

1501 all[(pkg.source, pkg.source_version)].add(p) 

1502 

1503 print('* %s' % arch) 

1504 

1505 for (src, ver), pkgs in sorted(all.items()): 

1506 print(' %s (%s): %s' % (src, ver, ' '.join(sorted(pkgs)))) 

1507 

1508 print() 
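
The report method above buckets uninstallable binaries by their owning source package before printing. A tiny self-contained sketch of the same grouping (the package data here is made up for illustration):

```python
from collections import defaultdict

# map binary name -> (source, source_version), mimicking pkg.source lookups
binaries = {
    'libfoo1': ('foo', '1.0'),
    'foo-utils': ('foo', '1.0'),
    'bar': ('bar', '2.1'),
}

grouped = defaultdict(set)
for name, (src, ver) in binaries.items():
    grouped[(src, ver)].add(name)

for (src, ver), pkgs in sorted(grouped.items()):
    print('    %s (%s): %s' % (src, ver, ' '.join(sorted(pkgs))))
```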

1509 

1510 def _remove_archall_faux_packages(self): 

1511 """Remove faux packages added for the excuses phase 

1512 

1513 To prevent binary packages from going missing while they are listed by 

1514 their source package, we add bin:faux packages while reading in the 

1515 Sources. They are used during the excuses phase to prevent packages 

1516 from becoming candidates. However, they interfere in complex ways 

1517 during the installability phase, so instead of having all code during 

1518 migration be aware of this excuses phase implementation detail, let's 

1519 remove them again. 

1520 

1521 """ 

1522 if not self.options.archall_inconsistency_allowed: 

1523 all_binaries = self.all_binaries 

1524 faux = {x for x in all_binaries.keys() if x[2] == 'faux'} 

1525 for pkg in faux: 

1526 del all_binaries[pkg] 

1527 

1528 for suite in self.suite_info._suites.values(): 

1529 for arch in suite.binaries.keys(): 

1530 binaries = suite.binaries[arch] 

1531 faux = {x for x in binaries if binaries[x].pkg_id[2] == 'faux'} 

1532 for pkg in faux: 

1533 del binaries[pkg] 

1534 sources = suite.sources 

1535 for src in sources.keys(): 

1536 faux = {x for x in sources[src].binaries if x[2] == 'faux'} 

1537 sources[src].binaries -= faux 
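
The sweep above relies on package ids being tuples whose third field is the architecture, so `x[2] == 'faux'` selects the synthetic entries. A minimal sketch of that pattern with made-up data (not the real britney data structures):

```python
# package ids assumed to be (name, version, architecture) tuples
binaries_by_id = {
    ('foo', '1.0', 'amd64'): 'real package',
    ('foo', '1.0', 'faux'): 'synthetic excuses-phase placeholder',
}

# collect first, then delete: never mutate a dict while iterating it
faux = {pkg_id for pkg_id in binaries_by_id if pkg_id[2] == 'faux'}
for pkg_id in faux:
    del binaries_by_id[pkg_id]
```

Note the two-step collect-then-delete shape, which the original code also uses to avoid mutating the mapping during iteration.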

1538 

1539 def main(self): 

1540 """Main method 

1541 

1542 This is the entry point for the class: it includes the list of calls 

1543 for the member methods which will produce the output files. 

1544 """ 

1545 # if running in --print-uninst mode, quit 

1546 if self.options.print_uninst: 

1547 return 

1548 # if no actions are provided, build the excuses and sort them 

1549 elif not self.options.actions: 

1550 self.write_excuses() 

1551 # otherwise, use the actions provided by the command line 

1552 else: 

1553 self.upgrade_me = self.options.actions.split() 

1554 

1555 self._remove_archall_faux_packages() 

1556 

1557 if self.options.compute_migrations or self.options.hint_tester: 

1558 if self.options.dry_run: 

1559 self.logger.info("Upgrade output not (also) written to a separate file" 

1560 " as this is a dry-run.") 

1561 elif hasattr(self.options, 'upgrade_output'): 

1562 upgrade_output = self.options.upgrade_output 

1563 file_handler = logging.FileHandler(upgrade_output, mode='w', encoding='utf-8') 

1564 output_formatter = logging.Formatter('%(message)s') 

1565 file_handler.setFormatter(output_formatter) 

1566 self.output_logger.addHandler(file_handler) 

1567 self.logger.info("Logging upgrade output to %s", upgrade_output) 

1568 else: 

1569 self.logger.info("Upgrade output not (also) written to a separate file" 

1570 " as the UPGRADE_OUTPUT configuration is not provided.") 

1571 

1572 # run the hint tester 

1573 if self.options.hint_tester: 

1574 self.hint_tester() 

1575 # run the upgrade test 

1576 else: 

1577 self.upgrade_testing() 

1578 

1579 self.logger.info('> Stats from the installability tester') 

1580 for stat in self._inst_tester.stats.stats(): 

1581 self.logger.info('> %s', stat) 

1582 else: 

1583 self.logger.info('Migration computation skipped as requested.') 

1584 if not self.options.dry_run: 

1585 self._policy_engine.save_state(self) 

1586 logging.shutdown() 

1587 

1588 

1589if __name__ == '__main__': 

1590 Britney().main()