Coverage for britney2/britney.py: 84% (767 statements)
coverage.py v6.5.0, created at 2025-03-23 07:34 +0000

#!/usr/bin/python3 -u
# -*- coding: utf-8 -*-

# Copyright (C) 2001-2008 Anthony Towns <ajt@debian.org>
#                         Andreas Barth <aba@debian.org>
#                         Fabio Tranchitella <kobold@debian.org>
# Copyright (C) 2010-2013 Adam D. Barratt <adsb@debian.org>

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

"""
= Introduction =

This is the Debian testing updater script, also known as "Britney".

Packages are usually installed into the `testing' distribution after
they have undergone some degree of testing in unstable. The goal of
this software is to do this task in a smart way, allowing testing
to always be fully installable and close to being a release candidate.

Britney's source code is split between two different but related tasks:
the first one is the generation of the update excuses, while the
second tries to update testing with the valid candidates; first
each package alone, then larger and even larger sets of packages
together. Each try is accepted if testing is not more uninstallable
after the update than before.

= Data Loading =

In order to analyze the entire Debian distribution, Britney needs to
load the whole archive into memory: this means more than 10,000 packages
for twelve architectures, as well as the dependency interconnections
between them. For this reason, the memory requirements for running this
software are quite high and at least 1 gigabyte of RAM should be available.

Britney loads the source packages from the `Sources' file and the binary
packages from the `Packages_${arch}' files, where ${arch} is substituted
with the supported architectures. While loading the data, the software
analyzes the dependencies and builds a directed weighted graph in memory
with all the interconnections between the packages (see Britney.read_sources
and Britney.read_binaries).
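The loading described above can be illustrated with a small standard-library-only
reader for the stanza format used by `Sources' and `Packages_${arch}' files.
This is a sketch only: Britney itself uses apt_pkg.TagFile, and this version
ignores continuation lines.

```python
# Minimal sketch of reading package stanzas like those in a Packages_${arch}
# file. Britney uses apt_pkg.TagFile for this; the version below only
# illustrates the "Field: value" paragraph format and skips continuation lines.

def read_stanzas(text: str) -> list[dict[str, str]]:
    stanzas = []
    for para in text.split("\n\n"):
        fields = {}
        for line in para.splitlines():
            # A new field starts at column 0; indented lines are continuations.
            if ":" in line and not line.startswith((" ", "\t")):
                key, _, value = line.partition(":")
                fields[key.strip()] = value.strip()
        if fields:
            stanzas.append(fields)
    return stanzas

example = """\
Package: dpkg
Version: 1.21.22
Architecture: amd64
Depends: libc6 (>= 2.15)

Package: apt
Version: 2.6.1
Architecture: amd64
Depends: dpkg (>= 1.20.4)
"""

print([s["Package"] for s in read_stanzas(example)])
```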


Other than source and binary packages, Britney loads the following data:

 * BugsV, which contains the list of release-critical bugs for a given
   version of a source or binary package (see RCBugPolicy.read_bugs).

 * Dates, which contains the date of the upload of a given version
   of a source package (see Britney.read_dates).

 * Urgencies, which contains the urgency of the upload of a given
   version of a source package (see AgePolicy._read_urgencies).

 * Hints, which contains lists of commands which modify the standard behaviour
   of Britney (see Britney.read_hints).

 * Other policies typically require their own data.

For a more detailed explanation of the format of these files, please read
the documentation of the related methods. Their exact meaning is
explained in the chapter "Excuses Generation".

= Excuses =

An excuse is a detailed explanation of why a package can or cannot
be updated in the testing distribution from a newer package in
another distribution (for example unstable). The excuses are written
to an HTML file published over HTTP, as well as to a YAML file.
Maintainers can parse these files, manually or automatically, to find
out why their packages have or have not been updated.

== Excuses generation ==

These are the steps (with references to method names) that Britney
follows to generate the update excuses.

 * If a source package is available in testing but it is not
   present in unstable and no binary packages in unstable are
   built from it, then it is marked for removal.

 * Every source package in unstable and testing-proposed-updates,
   if already present in testing, is checked for binary-NMUs, new
   or dropped binary packages in all the supported architectures
   (see Britney.should_upgrade_srcarch). The steps to detect if an
   upgrade is needed are:

    1. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    2. For every binary package built from the new source, it checks
       for unsatisfied dependencies, new binary packages and updated
       binary packages (binNMU), excluding the architecture-independent
       ones and packages not built from the same source.

    3. For every binary package built from the old source, it checks
       if it is still built from the new source; if this is not true
       and the package is not architecture-independent, the script
       removes it from testing.

    4. Finally, if there is something worth doing (e.g. a new or updated
       binary package) and nothing wrong, it marks the source package
       as "Valid candidate", or "Not considered" if there is something
       wrong which prevented the update.

 * Every source package in unstable and testing-proposed-updates is
   checked for upgrade (see Britney.should_upgrade_src). The steps
   to detect if an upgrade is needed are:

    1. If the source package in testing is more recent, the new one
       is ignored.

    2. If the source package doesn't exist (is fake), which means that
       a binary package refers to it but it is not present in the
       `Sources' file, the new one is ignored.

    3. If the package doesn't exist in testing, the urgency of the
       upload is ignored and set to the default (currently `low').

    4. If there is a `remove' hint for the source package, the package
       is ignored: it will be removed and not updated.

    5. If there is a `block' hint for the source package without an
       `unblock' hint or a `block-all source' hint, the package is ignored.

    6. If there is a `block-udeb' hint for the source package, it will
       have the same effect as `block', but may only be cancelled by
       a subsequent `unblock-udeb' hint.

    7. If the suite is unstable, the update can go ahead only if the
       upload happened more than the minimum number of days specified
       by the urgency of the upload; if this is not true, the package
       is ignored as `too-young'. Note that the urgency is sticky,
       meaning that the highest urgency uploaded since the previous
       testing transition is taken into account.

    8. If the suite is unstable, all the architecture-dependent binary
       packages and the architecture-independent ones for the `nobreakall'
       architectures have to be built from the source we are considering.
       If this is not true, then these are called `out-of-date'
       architectures and the package is ignored.

    9. The source package must have at least one binary package, otherwise
       it is ignored.

    10. If the suite is unstable, the new source package must have no
        release-critical bugs which do not also apply to the testing
        one. If this is not true, the package is ignored as `buggy'.

    11. If there is a `force' hint for the source package, then it is
        updated even if it is marked as ignored by the previous steps.

    12. If the suite is {testing-,}proposed-updates, the source package can
        be updated only if there is an explicit approval for it. Unless
        a `force' hint exists, the new package must also be available
        on all of the architectures for which it has binary packages in
        testing.

    13. If the package is not ignored, it is marked as "Valid candidate";
        otherwise it is marked as "Not considered".

 * The list of `remove' hints is processed: if the requested source
   package is not already being updated or removed and the version
   actually in testing is the same specified with the `remove' hint,
   it is marked for removal.

 * The excuses are sorted by the number of days from the last upload
   (days-old) and by name.

 * A list of unconsidered excuses (for which the package is not upgraded)
   is built. Using this list, all of the excuses depending on them are
   marked as invalid for "impossible dependencies".

 * The excuses are written to an HTML file.
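The invalidation step above can be sketched as a worklist over excuse
dependencies. This is illustrative only; the function name and the dict
shapes are assumptions for the sketch, not Britney's actual data structures.

```python
# Worklist sketch of the invalidation step: if an excuse depends on a
# package whose own excuse is unconsidered, it becomes unconsidered too,
# and that result propagates to its reverse dependencies.

def invalidate(depends: dict[str, set[str]], valid: dict[str, bool]) -> dict[str, bool]:
    result = dict(valid)
    worklist = [name for name, ok in result.items() if not ok]
    while worklist:
        bad = worklist.pop()
        for name, deps in depends.items():
            if result.get(name) and bad in deps:
                result[name] = False  # invalid: impossible dependency
                worklist.append(name)
    return result

depends = {"a": {"b"}, "b": {"c"}, "c": set()}
valid = {"a": True, "b": True, "c": False}
print(invalidate(depends, valid))
```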

"""
import contextlib
import logging
import optparse
import os
import sys
import time
from collections import defaultdict
from functools import reduce
from itertools import chain
from operator import attrgetter
from typing import TYPE_CHECKING, Any, Optional, cast
from collections.abc import Iterator

import apt_pkg

from britney2 import BinaryPackage, BinaryPackageId, SourcePackage, Suites
from britney2.excusefinder import ExcuseFinder
from britney2.hints import Hint, HintCollection, HintParser
from britney2.inputs.suiteloader import (
    DebMirrorLikeSuiteContentLoader,
    MissingRequiredConfigurationError,
)
from britney2.installability.builder import build_installability_tester
from britney2.installability.solver import InstallabilitySolver
from britney2.migration import MigrationManager
from britney2.migrationitem import MigrationItem, MigrationItemFactory
from britney2.policies.autopkgtest import AutopkgtestPolicy
from britney2.policies.policy import (
    AgePolicy,
    BlockPolicy,
    BuildDependsPolicy,
    BuiltOnBuilddPolicy,
    BuiltUsingPolicy,
    DependsPolicy,
    ImplicitDependencyPolicy,
    PiupartsPolicy,
    PolicyEngine,
    PolicyLoadRequest,
    RCBugPolicy,
    ReproduciblePolicy,
    ReverseRemovalPolicy,
)
from britney2.utils import (
    MigrationConstraintException,
    clone_nuninst,
    compile_nuninst,
    format_and_log_uninst,
    is_nuninst_asgood_generous,
    log_and_format_old_libraries,
    newly_uninst,
    old_libraries,
    parse_option,
    parse_provides,
    read_nuninst,
    write_excuses,
    write_heidi,
    write_heidi_delta,
    write_nuninst,
)

if TYPE_CHECKING:
    from .excuse import Excuse
    from .installability.tester import InstallabilityTester
    from .installability.universe import BinaryPackageUniverse
    from .transaction import MigrationTransactionState


__author__ = "Fabio Tranchitella and the Debian Release Team"
__version__ = "2.0"


MIGRATION_POLICIES = [
    PolicyLoadRequest.always_load(DependsPolicy),
    PolicyLoadRequest.conditionally_load(RCBugPolicy, "rcbug_enable", True),
    PolicyLoadRequest.conditionally_load(PiupartsPolicy, "piuparts_enable", True),
    PolicyLoadRequest.always_load(ImplicitDependencyPolicy),
    PolicyLoadRequest.conditionally_load(AutopkgtestPolicy, "adt_enable", True),
    PolicyLoadRequest.conditionally_load(ReproduciblePolicy, "repro_enable", False),
    PolicyLoadRequest.conditionally_load(AgePolicy, "age_enable", True),
    PolicyLoadRequest.always_load(BuildDependsPolicy),
    PolicyLoadRequest.always_load(BlockPolicy),
    PolicyLoadRequest.conditionally_load(
        BuiltUsingPolicy, "built_using_policy_enable", True
    ),
    PolicyLoadRequest.conditionally_load(BuiltOnBuilddPolicy, "check_buildd", False),
    PolicyLoadRequest.always_load(ReverseRemovalPolicy),
]


class Britney(object):
    """Britney, the Debian testing updater script

    This is the script that updates the testing distribution. It is executed
    each day after the installation of the updated packages. It generates the
    `Packages' files for the testing distribution, but it does so in an
    intelligent manner; it tries to avoid any inconsistency and to use only
    non-buggy packages.

    For more documentation on this script, please read the Developers Reference.
    """

    HINTS_HELPERS = (
        "easy",
        "hint",
        "remove",
        "block",
        "block-udeb",
        "unblock",
        "unblock-udeb",
        "approve",
        "remark",
        "ignore-piuparts",
        "ignore-rc-bugs",
        "force-skiptest",
        "force-badtest",
    )
    HINTS_STANDARD = ("urgent", "age-days") + HINTS_HELPERS
    # ALL = {"force", "force-hint", "block-all"} | HINTS_STANDARD | registered policy hints (not covered above)
    HINTS_ALL = "ALL"
    pkg_universe: "BinaryPackageUniverse"
    _inst_tester: "InstallabilityTester"
    constraints: dict[str, list[str]]
    suite_info: Suites

    def __init__(self) -> None:
        """Class constructor

        This method initializes and populates the data lists, which contain all
        the information needed by the other methods of the class.
        """

        # setup logging - provide the "short level name" (i.e. INFO -> I) that
        # we used to use prior to using the logging module.

        old_factory = logging.getLogRecordFactory()
        short_level_mapping = {
            "CRITICAL": "F",
            "INFO": "I",
            "WARNING": "W",
            "ERROR": "E",
            "DEBUG": "N",
        }

        def record_factory(
            *args: Any, **kwargs: Any
        ) -> logging.LogRecord:  # pragma: no cover
            record = old_factory(*args, **kwargs)
            try:
                record.shortlevelname = short_level_mapping[record.levelname]
            except KeyError:
                record.shortlevelname = record.levelname
            return record

        logging.setLogRecordFactory(record_factory)
        logging.basicConfig(
            format="{shortlevelname}: [{asctime}] - {message}",
            style="{",
            datefmt="%Y-%m-%dT%H:%M:%S%z",
            stream=sys.stdout,
        )

        self.logger = logging.getLogger()

        # Logger for "upgrade_output"; the file handler will be attached later when
        # we are ready to open the file.
        self.output_logger = logging.getLogger("britney2.output.upgrade_output")
        self.output_logger.setLevel(logging.INFO)

        # initialize the apt_pkg back-end
        apt_pkg.init()

        # parse the command line arguments
        self._policy_engine = PolicyEngine()
        self.__parse_arguments()
        assert self.suite_info is not None  # for type checking

        self.all_selected: list[MigrationItem] = []
        self.excuses: dict[str, "Excuse"] = {}
        self.upgrade_me: list[MigrationItem] = []

        if self.options.nuninst_cache:
            self.logger.info(
                "Not building the list of non-installable packages, as requested"
            )
            if self.options.print_uninst:
                nuninst = read_nuninst(
                    self.options.noninst_status, self.options.architectures
                )
                print("* summary")
                print(
                    "\n".join(
                        "%4d %s" % (len(nuninst[x]), x)
                        for x in self.options.architectures
                    )
                )
                return

        try:
            constraints_file = os.path.join(
                self.options.static_input_dir, "constraints"
            )
            faux_packages = os.path.join(self.options.static_input_dir, "faux-packages")
        except AttributeError:
            self.logger.info("The static_input_dir option is not set")
            constraints_file = None
            faux_packages = None
        if faux_packages is not None and os.path.exists(faux_packages):
            self.logger.info("Loading faux packages from %s", faux_packages)
            self._load_faux_packages(faux_packages)
        elif faux_packages is not None:
            self.logger.info("No Faux packages as %s does not exist", faux_packages)

        if constraints_file is not None and os.path.exists(constraints_file):
            self.logger.info("Loading constraints from %s", constraints_file)
            self.constraints = self._load_constraints(constraints_file)
        else:
            if constraints_file is not None:
                self.logger.info(
                    "No constraints as %s does not exist", constraints_file
                )
            self.constraints = {
                "keep-installable": [],
            }

        self.logger.info("Compiling Installability tester")
        self.pkg_universe, self._inst_tester = build_installability_tester(
            self.suite_info, self.options.architectures
        )
        target_suite = self.suite_info.target_suite
        target_suite.inst_tester = self._inst_tester

        self.allow_uninst: dict[str, set[Optional[str]]] = {}
        for arch in self.options.architectures:
            self.allow_uninst[arch] = set()
        self._migration_item_factory: MigrationItemFactory = MigrationItemFactory(
            self.suite_info
        )
        self._hint_parser: HintParser = HintParser(self._migration_item_factory)
        self._migration_manager: MigrationManager = MigrationManager(
            self.options,
            self.suite_info,
            self.all_binaries,
            self.pkg_universe,
            self.constraints,
            self.allow_uninst,
            self._migration_item_factory,
            self.hints,
        )

        if not self.options.nuninst_cache:
            self.logger.info(
                "Building the list of non-installable packages for the full archive"
            )
            self._inst_tester.compute_installability()
            nuninst = compile_nuninst(
                target_suite, self.options.architectures, self.options.nobreakall_arches
            )
            self.nuninst_orig: dict[str, set[str]] = nuninst
            for arch in self.options.architectures:
                self.logger.info(
                    "> Found %d non-installable packages for %s",
                    len(nuninst[arch]),
                    arch,
                )
                if self.options.print_uninst:
                    self.nuninst_arch_report(nuninst, arch)

            if self.options.print_uninst:
                print("* summary")
                print(
                    "\n".join(
                        map(
                            lambda x: "%4d %s" % (len(nuninst[x]), x),
                            self.options.architectures,
                        )
                    )
                )
                return
            else:
                write_nuninst(self.options.noninst_status, nuninst)

            stats = self._inst_tester.compute_stats()
            self.logger.info("> Installability tester statistics (per architecture)")
            for arch in self.options.architectures:
                arch_stat = stats[arch]
                self.logger.info("> %s", arch)
                for stat in arch_stat.stat_summary():
                    self.logger.info("> - %s", stat)
        else:
            self.logger.info("Loading uninstallability counters from cache")
            self.nuninst_orig = read_nuninst(
                self.options.noninst_status, self.options.architectures
            )

        # nuninst_orig may get updated during the upgrade process
        self.nuninst_orig_save: dict[str, set[str]] = clone_nuninst(
            self.nuninst_orig, architectures=self.options.architectures
        )

        self._policy_engine.register_policy_hints(self._hint_parser)

        try:
            self.read_hints(self.options.hintsdir)
        except AttributeError:
            self.read_hints(os.path.join(self.suite_info["unstable"].path, "Hints"))

        self._policy_engine.initialise(self, self.hints)

    def __parse_arguments(self) -> None:
        """Parse the command line arguments

        This method parses and initializes the command line arguments.
        While doing so, it preprocesses some of the options into a
        suitable form for the other methods of the class.
        """
        # initialize the parser
        parser = optparse.OptionParser(version="%prog")
        parser.add_option(
            "-v", "", action="count", dest="verbose", help="enable verbose output"
        )
        parser.add_option(
            "-c",
            "--config",
            action="store",
            dest="config",
            default="/etc/britney.conf",
            help="path for the configuration file",
        )
        parser.add_option(
            "",
            "--architectures",
            action="store",
            dest="architectures",
            default=None,
            help="override architectures from configuration file",
        )
        parser.add_option(
            "",
            "--actions",
            action="store",
            dest="actions",
            default=None,
            help="override the list of actions to be performed",
        )
        parser.add_option(
            "",
            "--hints",
            action="store",
            dest="hints",
            default=None,
            help="additional hints, separated by semicolons",
        )
        parser.add_option(
            "",
            "--hint-tester",
            action="store_true",
            dest="hint_tester",
            default=None,
            help="provide a command line interface to test hints",
        )
        parser.add_option(
            "",
            "--dry-run",
            action="store_true",
            dest="dry_run",
            default=False,
            help="disable all outputs to the testing directory",
        )
        parser.add_option(
            "",
            "--nuninst-cache",
            action="store_true",
            dest="nuninst_cache",
            default=False,
            help="do not build the non-installability status, use the cache from file",
        )
        parser.add_option(
            "",
            "--print-uninst",
            action="store_true",
            dest="print_uninst",
            default=False,
            help="just print a summary of uninstallable packages",
        )
        parser.add_option(
            "",
            "--compute-migrations",
            action="store_true",
            dest="compute_migrations",
            default=True,
            help="Compute which packages can migrate (the default)",
        )
        parser.add_option(
            "",
            "--no-compute-migrations",
            action="store_false",
            dest="compute_migrations",
            help="Do not compute which packages can migrate.",
        )
        parser.add_option(
            "",
            "--series",
            action="store",
            dest="series",
            default="",
            help="set distribution series name",
        )
        parser.add_option(
            "",
            "--distribution",
            action="store",
            dest="distribution",
            default="debian",
            help="set distribution name",
        )
        (self.options, self.args) = parser.parse_args()

        if self.options.verbose:
            if self.options.verbose > 1:
                self.logger.setLevel(logging.DEBUG)
            else:
                self.logger.setLevel(logging.INFO)
        else:
            self.logger.setLevel(logging.WARNING)
        # Historical way to get debug information (equivalent to -vv)
        try:  # pragma: no cover
            if int(os.environ.get("BRITNEY_DEBUG", "0")):
                self.logger.setLevel(logging.DEBUG)
        except ValueError:  # pragma: no cover
            pass

        # integrity checks
        if self.options.nuninst_cache and self.options.print_uninst:  # pragma: no cover
            self.logger.error("nuninst_cache and print_uninst are mutually exclusive!")
            sys.exit(1)

        # if the configuration file exists, then read it and set the additional options
        if not os.path.isfile(self.options.config):  # pragma: no cover
            self.logger.error(
                "Unable to read the configuration file (%s), exiting!",
                self.options.config,
            )
            sys.exit(1)

        self.HINTS: dict[str, Any] = {"command-line": self.HINTS_ALL}
        with open(self.options.config, encoding="utf-8") as config:
            for line in config:
                if "=" in line and not line.strip().startswith("#"):
                    k, v = line.split("=", 1)
                    k = k.strip()
                    v = v.strip()
                    if k.startswith("HINTS_"):
                        self.HINTS[k.split("_")[1].lower()] = reduce(
                            lambda x, y: x + y,
                            [
                                hasattr(self, "HINTS_" + i)
                                and getattr(self, "HINTS_" + i)
                                or (i,)
                                for i in v.split()
                            ],
                        )
                    elif not hasattr(self.options, k.lower()) or not getattr(
                        self.options, k.lower()
                    ):
                        setattr(self.options, k.lower(), v)

        parse_option(self.options, "archall_inconsistency_allowed", to_bool=True)

        suite_loader = DebMirrorLikeSuiteContentLoader(self.options)

        try:
            self.suite_info = suite_loader.load_suites()
        except MissingRequiredConfigurationError as e:  # pragma: no cover
            self.logger.error(
                "Could not load the suite content due to missing configuration: %s",
                str(e),
            )
            sys.exit(1)
        self.all_binaries = suite_loader.all_binaries()
        self.options.components = suite_loader.components
        self.options.architectures = suite_loader.architectures
        self.options.nobreakall_arches = suite_loader.nobreakall_arches
        self.options.outofsync_arches = suite_loader.outofsync_arches
        self.options.break_arches = suite_loader.break_arches
        self.options.new_arches = suite_loader.new_arches
        if self.options.series == "":
            self.options.series = self.suite_info.target_suite.name

        if self.options.heidi_output and not hasattr(
            self.options, "heidi_delta_output"
        ):
            self.options.heidi_delta_output = self.options.heidi_output + "Delta"

        self.options.smooth_updates = self.options.smooth_updates.split()

        parse_option(self.options, "ignore_cruft", to_bool=True)
        parse_option(self.options, "check_consistency_level", default=2, to_int=True)
        parse_option(self.options, "build_url")

        self._policy_engine.load_policies(
            self.options, self.suite_info, MIGRATION_POLICIES
        )

    @property
    def hints(self) -> HintCollection:
        return self._hint_parser.hints

    def _load_faux_packages(self, faux_packages_file: str) -> None:
        """Loads fake packages

        In rare cases, it is useful to create a "fake" package that can be used to satisfy
        dependencies. This is usually needed for packages that are not shipped directly
        on this mirror but are a prerequisite for using this mirror (e.g. some vendors provide
        non-distributable "setup" packages and contrib/non-free packages depend on these).

        :param faux_packages_file: Path to the file containing the fake package definitions
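        An illustrative stanza for such a file, using the fields this loader
        reads (Package is required; Version, Provides, Architecture, Component
        and Multi-Arch are optional and fall back to the defaults in the code).
        The package names here are invented for the example:

        ```
        Package: vendor-setup
        Version: 1.0-1
        Architecture: amd64 i386
        Component: non-free
        Provides: vendor-license
        ```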

        """
        tag_file = apt_pkg.TagFile(faux_packages_file)
        get_field = tag_file.section.get
        step = tag_file.step
        no = 0
        pri_source_suite = self.suite_info.primary_source_suite
        target_suite = self.suite_info.target_suite

        while step():
            no += 1
            pkg_name = get_field("Package", None)
            if pkg_name is None:  # pragma: no cover
                raise ValueError(
                    "Missing Package field in paragraph %d (file %s)"
                    % (no, faux_packages_file)
                )
            pkg_name = sys.intern(pkg_name)
            version = sys.intern(get_field("Version", "1.0-1"))
            provides_raw = get_field("Provides")
            archs_raw = get_field("Architecture", None)
            component = get_field("Component", "non-free")
            if archs_raw:
                archs = archs_raw.split()
            else:
                archs = self.options.architectures
            faux_section = "faux"
            if component != "main":
                faux_section = "%s/faux" % component
            src_data = SourcePackage(
                pkg_name,
                version,
                sys.intern(faux_section),
                set(),
                None,
                True,
                None,
                None,
                [],
                [],
            )

            target_suite.sources[pkg_name] = src_data
            pri_source_suite.sources[pkg_name] = src_data

            for arch in archs:
                pkg_id = BinaryPackageId(pkg_name, version, arch)
                if provides_raw:
                    provides = parse_provides(
                        provides_raw, pkg_id=pkg_id, logger=self.logger
                    )
                else:
                    provides = []
                bin_data = BinaryPackage(
                    version,
                    faux_section,
                    pkg_name,
                    version,
                    arch,
                    get_field("Multi-Arch"),
                    None,
                    None,
                    provides,
                    False,
                    pkg_id,
                    [],
                )

                src_data.binaries.add(pkg_id)
                target_suite.binaries[arch][pkg_name] = bin_data
                pri_source_suite.binaries[arch][pkg_name] = bin_data

                # register provided packages with the target suite provides table
                for provided_pkg, provided_version, _ in bin_data.provides:
                    target_suite.provides_table[arch][provided_pkg].add(
                        (pkg_name, provided_version)
                    )

                self.all_binaries[pkg_id] = bin_data

    def _load_constraints(self, constraints_file: str) -> dict[str, list[str]]:
        """Loads configurable constraints

        The constraints file can contain extra rules that Britney should attempt
        to satisfy. Examples can be "keep package X in testing and ensure it is
        installable".

        :param constraints_file: Path to the file containing the constraints
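        An illustrative constraints stanza (the package names are invented;
        `present-and-installable' is the only constraint this loader accepts,
        and an architecture restriction must be a plain space-separated
        [arch1 arch2] list, without commas or negation):

        ```
        Fake-Package-Name: installable-set
        Constraint: present-and-installable
        Package-List:
         dpkg
         init [amd64 arm64]
        ```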

787 """ 

788 tag_file = apt_pkg.TagFile(constraints_file) 

789 get_field = tag_file.section.get 

790 step = tag_file.step 

791 no = 0 

792 faux_version = sys.intern("1") 

793 faux_section = sys.intern("faux") 

794 keep_installable: list[str] = [] 

795 constraints = {"keep-installable": keep_installable} 

796 pri_source_suite = self.suite_info.primary_source_suite 

797 target_suite = self.suite_info.target_suite 

798 

799 while step(): 

800 no += 1 

801 pkg_name = get_field("Fake-Package-Name", None) 

802 if pkg_name is None: # pragma: no cover 

803 raise ValueError( 

804 "Missing Fake-Package-Name field in paragraph %d (file %s)" 

805 % (no, constraints_file) 

806 ) 

807 pkg_name = sys.intern(pkg_name) 

808 

809 def mandatory_field(x: str) -> str: 

810 v: str = get_field(x, None) 

811 if v is None: # pragma: no cover 

812 raise ValueError( 

813 "Missing %s field for %s (file %s)" 

814 % (x, pkg_name, constraints_file) 

815 ) 

816 return v 

817 

818 constraint = mandatory_field("Constraint") 

819 if constraint not in {"present-and-installable"}: # pragma: no cover 

820 raise ValueError( 

821 "Unsupported constraint %s for %s (file %s)" 

822 % (constraint, pkg_name, constraints_file) 

823 ) 

824 

825 self.logger.info(" - constraint %s", pkg_name) 

826 

827 pkg_list = [ 

828 x.strip() 

829 for x in mandatory_field("Package-List").split("\n") 

830 if x.strip() != "" and not x.strip().startswith("#") 

831 ] 

832 src_data = SourcePackage( 

833 pkg_name, 

834 faux_version, 

835 faux_section, 

836 set(), 

837 None, 

838 True, 

839 None, 

840 None, 

841 [], 

842 [], 

843 ) 

844 target_suite.sources[pkg_name] = src_data 

845 pri_source_suite.sources[pkg_name] = src_data 

846 keep_installable.append(pkg_name) 

847 for arch in self.options.architectures: 

848 deps = [] 

849 for pkg_spec in pkg_list: 

850 s = pkg_spec.split(None, 1) 

851 if len(s) == 1: 

852 deps.append(s[0]) 

853 else: 

854 pkg, arch_res = s 

855 if not ( 

856 arch_res.startswith("[") and arch_res.endswith("]") 

857 ): # pragma: no cover 

858 raise ValueError( 

859 "Invalid arch-restriction on %s - should be [arch1 arch2] (for %s file %s)" 

860 % (pkg, pkg_name, constraints_file) 

861 ) 

862 arch_res_l = arch_res[1:-1].split() 

863 if not arch_res_l: # pragma: no cover 

864 msg = "Empty arch-restriction for %s - should list at least one architecture (for %s file %s)" 

865 raise ValueError(msg % (pkg, pkg_name, constraints_file)) 

866 for a in arch_res_l: 

867 if a == arch: 

868 deps.append(pkg) 

869 elif "," in a or "!" in a: # pragma: no cover 

870 msg = "Invalid arch-restriction for %s: Uses comma or negation (for %s file %s)" 

871 raise ValueError( 

872 msg % (pkg, pkg_name, constraints_file) 

873 ) 

874 pkg_id = BinaryPackageId(pkg_name, faux_version, arch) 

875 bin_data = BinaryPackage( 

876 faux_version, 

877 faux_section, 

878 pkg_name, 

879 faux_version, 

880 arch, 

881 "no", 

882 ", ".join(deps), 

883 None, 

884 [], 

885 False, 

886 pkg_id, 

887 [], 

888 ) 

889 src_data.binaries.add(pkg_id) 

890 target_suite.binaries[arch][pkg_name] = bin_data 

891 pri_source_suite.binaries[arch][pkg_name] = bin_data 

892 self.all_binaries[pkg_id] = bin_data 

893 

894 return constraints 
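The `Package-List` entries parsed above accept an optional `[arch1 arch2]` restriction. A minimal standalone sketch of that matching rule (`applies` is a hypothetical helper for illustration, not part of Britney):

```python
# Sketch of the "[arch1 arch2]" arch-restriction syntax handled above.
# A spec without brackets applies on every architecture; with brackets,
# only on the listed ones. Comma and "!" negation are rejected upstream.
def applies(pkg_spec: str, arch: str) -> bool:
    parts = pkg_spec.split(None, 1)
    if len(parts) == 1:
        return True  # no restriction: the dependency applies everywhere
    pkg, arch_res = parts
    if not (arch_res.startswith("[") and arch_res.endswith("]")):
        raise ValueError("should be [arch1 arch2]")
    return arch in arch_res[1:-1].split()

print(applies("grub-efi [amd64 arm64]", "i386"))  # False
```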

895 

896 # Data reading/writing methods 

897 # ---------------------------- 

898 

899 def read_hints(self, hintsdir: str) -> None: 

900 """Read the hint commands from the specified directory 

901 

902 The hint commands are read from the files contained in the directory 

903 specified by the `hintsdir' parameter. 

904 The names of the files have to be the same as the authorized users 

905 for the hints. 

906 

907 The file contains rows with the format: 

908 

909 <command> <package-name>[/<version>] 

910 

911 The parsed hints are stored in the hint parser; they can be 

912 retrieved afterwards through its `hints' attribute. 

913 """ 

914 

915 for who in self.HINTS.keys(): 

916 if who == "command-line": 

917 lines = self.options.hints.split(";") if self.options.hints else () 

918 filename = "<cmd-line>" 

919 self._hint_parser.parse_hints(who, self.HINTS[who], filename, lines) 

920 else: 

921 filename = os.path.join(hintsdir, who) 

922 if not os.path.isfile(filename): 

923 self.logger.error( 

924 "Cannot read hints list from %s, no such file!", filename 

925 ) 

926 continue 

927 self.logger.info("Loading hints list from %s", filename) 

928 with open(filename, encoding="utf-8") as f: 

929 self._hint_parser.parse_hints(who, self.HINTS[who], filename, f) 

930 

931 hints = self._hint_parser.hints 

932 

933 for x in [ 

934 "block", 

935 "block-all", 

936 "block-udeb", 

937 "unblock", 

938 "unblock-udeb", 

939 "force", 

940 "urgent", 

941 "remove", 

942 "age-days", 

943 ]: 

944 z: dict[Optional[str], dict[Optional[str], tuple[Hint, str]]] = defaultdict( 

945 dict 

946 ) 

947 for hint in hints[x]: 

948 package = hint.package 

949 architecture = hint.architecture 

950 key = (hint, hint.user) 

951 if ( 

952 package in z 

953 and architecture in z[package] 

954 and z[package][architecture] != key 

955 ): 

956 hint2 = z[package][architecture][0] 

957 if x in ["unblock", "unblock-udeb"]: 

958 assert hint.version is not None 

959 assert hint2.version is not None 

960 if apt_pkg.version_compare(hint2.version, hint.version) < 0: 

961 # This hint is for a newer version, so discard the old one 

962 self.logger.warning( 

963 "Overriding %s[%s] = ('%s', '%s', '%s') with ('%s', '%s', '%s')", 

964 x, 

965 package, 

966 hint2.version, 

967 hint2.architecture, 

968 hint2.user, 

969 hint.version, 

970 hint.architecture, 

971 hint.user, 

972 ) 

973 hint2.set_active(False) 

974 else: 

975 # This hint is for an older version, so ignore it in favour of the new one 

976 self.logger.warning( 

977 "Ignoring %s[%s] = ('%s', '%s', '%s'), ('%s', '%s', '%s') is higher or equal", 

978 x, 

979 package, 

980 hint.version, 

981 hint.architecture, 

982 hint.user, 

983 hint2.version, 

984 hint2.architecture, 

985 hint2.user, 

986 ) 

987 hint.set_active(False) 

988 else: 

989 self.logger.warning( 

990 "Overriding %s[%s] = ('%s', '%s') with ('%s', '%s')", 

991 x, 

992 package, 

993 hint2.user, 

994 hint2, 

995 hint.user, 

996 hint, 

997 ) 

998 hint2.set_active(False) 

999 

1000 z[package][architecture] = key 

1001 

1002 for hint in hints["allow-uninst"]: 

1003 if hint.architecture == "source": 

1004 for arch in self.options.architectures: 

1005 self.allow_uninst[arch].add(hint.package) 

1006 else: 

1007 assert hint.architecture is not None 

1008 self.allow_uninst[hint.architecture].add(hint.package) 

1009 

1010 # Sanity check the hints hash 

1011 if len(hints["block"]) == 0 and len(hints["block-udeb"]) == 0: 

1012 self.logger.warning("WARNING: No block hints at all, not even udeb ones!") 

1013 
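The hint rows read above follow the `<command> <package-name>[/<version>]` format from the docstring. A sketch of that row shape (illustrative only; real hint rows may carry several package/version arguments, which this toy parser does not handle):

```python
from typing import Optional

# Sketch of the "<command> <package-name>[/<version>]" row format that
# read_hints() documents. parse_hint_row() is a hypothetical helper.
def parse_hint_row(line: str) -> tuple[str, str, Optional[str]]:
    command, _, rest = line.strip().partition(" ")
    package, _, version = rest.partition("/")
    return command, package, version or None

print(parse_hint_row("unblock foo/1.2-3"))  # ('unblock', 'foo', '1.2-3')
```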

1014 def write_excuses(self) -> None: 

1015 """Produce and write the update excuses 

1016 

1017 This method handles the update excuses generation: the packages are 

1018 looked at to determine whether they are valid candidates. For the details 

1019 of this procedure, please refer to the module docstring. 

1020 """ 

1021 

1022 self.logger.info("Update Excuses generation started") 

1023 

1024 mi_factory = self._migration_item_factory 

1025 excusefinder = ExcuseFinder( 

1026 self.options, 

1027 self.suite_info, 

1028 self.all_binaries, 

1029 self.pkg_universe, 

1030 self._policy_engine, 

1031 mi_factory, 

1032 self.hints, 

1033 ) 

1034 

1035 excuses, upgrade_me = excusefinder.find_actionable_excuses() 

1036 self.excuses = excuses 

1037 

1038 # sort the list of candidates 

1039 self.upgrade_me = sorted(upgrade_me) 

1040 old_lib_removals = old_libraries( 

1041 mi_factory, self.suite_info, self.options.outofsync_arches 

1042 ) 

1043 self.upgrade_me.extend(old_lib_removals) 

1044 self.output_logger.info( 

1045 "List of old libraries added to upgrade_me (%d):", len(old_lib_removals) 

1046 ) 

1047 log_and_format_old_libraries(self.output_logger, old_lib_removals) 

1048 

1049 # write excuses to the output file 

1050 if not self.options.dry_run: 

1051 self.logger.info("> Writing Excuses to %s", self.options.excuses_output) 

1052 write_excuses( 

1053 excuses, self.options.excuses_output, output_format="legacy-html" 

1054 ) 

1055 if hasattr(self.options, "excuses_yaml_output"): 

1056 self.logger.info( 

1057 "> Writing YAML Excuses to %s", self.options.excuses_yaml_output 

1058 ) 

1059 write_excuses( 

1060 excuses, self.options.excuses_yaml_output, output_format="yaml" 

1061 ) 

1062 

1063 self.logger.info("Update Excuses generation completed") 

1064 

1065 # Upgrade run 

1066 # ----------- 

1067 

1068 def eval_nuninst( 

1069 self, 

1070 nuninst: dict[str, set[str]], 

1071 original: Optional[dict[str, set[str]]] = None, 

1072 ) -> str: 

1073 """Return a string which represents the uninstallability counters 

1074 

1075 This method returns a string which represents the uninstallability 

1076 counters reading the uninstallability statistics `nuninst` and, if 

1077 present, merging the results with the `original` one. 

1078 

1079 An example of the output string is: 

1080 1+2: i-0:a-0:a-0:h-0:i-1:m-0:m-0:p-0:a-0:m-0:s-2:s-0 

1081 

1082 where the first part is the number of broken packages in non-break 

1083 architectures + the total number of broken packages for all the 

1084 architectures. 

1085 """ 

1086 res = [] 

1087 total = 0 

1088 totalbreak = 0 

1089 for arch in self.options.architectures: 

1090 if arch in nuninst: 

1091 n = len(nuninst[arch]) 

1092 elif original and arch in original: 

1093 n = len(original[arch]) 

1094 else: 

1095 continue 

1096 if arch in self.options.break_arches: 

1097 totalbreak = totalbreak + n 

1098 else: 

1099 total = total + n 

1100 res.append("%s-%d" % (arch[0], n)) 

1101 return "%d+%d: %s" % (total, totalbreak, ":".join(res)) 

1102 
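The counter string built by eval_nuninst() can be reproduced with a small standalone function. This is a sketch: it omits the `original` fallback and the break-arches handling is simplified to a plain membership test.

```python
# Standalone sketch of the "total+break: a-N:..." counter string from
# eval_nuninst(). Architectures absent from the stats are skipped, and
# break-arch counts go into the second (post-"+") total.
def eval_nuninst(nuninst, architectures, break_arches=()):
    res, total, totalbreak = [], 0, 0
    for arch in architectures:
        if arch not in nuninst:
            continue
        n = len(nuninst[arch])
        if arch in break_arches:
            totalbreak += n
        else:
            total += n
        res.append("%s-%d" % (arch[0], n))
    return "%d+%d: %s" % (total, totalbreak, ":".join(res))

print(eval_nuninst({"amd64": {"pkg1", "pkg2"}, "s390x": {"pkg3"}},
                   ["amd64", "s390x"], break_arches={"s390x"}))
# 2+1: a-2:s-1
```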

1103 def iter_packages( 

1104 self, 

1105 packages: list[MigrationItem], 

1106 selected: list[MigrationItem], 

1107 nuninst: Optional[dict[str, set[str]]] = None, 

1108 ) -> tuple[Optional[dict[str, set[str]]], list[MigrationItem]]: 

1109 """Iter on the list of actions and apply them one-by-one 

1110 

1111 This method applies the changes from `packages` to testing, checking the uninstallability 

1112 counters for every action performed. If the action does not improve them, it is reverted. 

1113 The method returns the new uninstallability counters and the remaining actions if the 

1114 final result is successful, otherwise (None, []). 

1115 

1116 :param selected: list of MigrationItems that have already been applied 

1117 :param nuninst: dict mapping each architecture to its set of uninstallable packages 

1118 """ 

1119 assert self.suite_info is not None # for type checking 

1120 group_info = {} 

1121 rescheduled_packages = packages 

1122 maybe_rescheduled_packages: list[MigrationItem] = [] 

1123 output_logger = self.output_logger 

1124 solver = InstallabilitySolver(self.pkg_universe, self._inst_tester) 

1125 mm = self._migration_manager 

1126 target_suite = self.suite_info.target_suite 

1127 

1128 for y in sorted(packages, key=attrgetter("uvname")): 

1129 try: 

1130 _, updates, rms, _ = mm.compute_groups(y) 

1131 result = (y, sorted(updates), sorted(rms)) 

1132 group_info[y] = result 

1133 except MigrationConstraintException as e: 

1134 rescheduled_packages.remove(y) 

1135 output_logger.info("not adding package to list: %s", (y.package)) 

1136 output_logger.info(" got exception: %s" % (repr(e))) 

1137 

1138 if nuninst: 

1139 nuninst_orig = nuninst 

1140 else: 

1141 nuninst_orig = self.nuninst_orig 

1142 

1143 nuninst_last_accepted = nuninst_orig 

1144 

1145 output_logger.info( 

1146 "recur: [] %s %d/0", ",".join(x.uvname for x in selected), len(packages) 

1147 ) 

1148 while rescheduled_packages: 

1149 groups = [group_info[x] for x in rescheduled_packages] 

1150 worklist = solver.solve_groups(groups) 

1151 rescheduled_packages = [] 

1152 

1153 worklist.reverse() 

1154 

1155 while worklist: 

1156 comp = worklist.pop() 

1157 comp_name = " ".join(item.uvname for item in comp) 

1158 output_logger.info("trying: %s" % comp_name) 

1159 with mm.start_transaction() as transaction: 

1160 accepted = False 

1161 try: 

1162 ( 

1163 accepted, 

1164 nuninst_after, 

1165 failed_arch, 

1166 new_cruft, 

1167 ) = mm.migrate_items_to_target_suite( 

1168 comp, nuninst_last_accepted 

1169 ) 

1170 if accepted: 

1171 selected.extend(comp) 

1172 transaction.commit() 

1173 output_logger.info("accepted: %s", comp_name) 

1174 output_logger.info( 

1175 " ori: %s", self.eval_nuninst(nuninst_orig) 

1176 ) 

1177 output_logger.info( 

1178 " pre: %s", self.eval_nuninst(nuninst_last_accepted) 

1179 ) 

1180 output_logger.info( 

1181 " now: %s", self.eval_nuninst(nuninst_after) 

1182 ) 

1183 if new_cruft: 

1184 output_logger.info( 

1185 " added new cruft items to list: %s", 

1186 " ".join(x.uvname for x in sorted(new_cruft)), 

1187 ) 

1188 

1189 if len(selected) <= 20: 

1190 output_logger.info( 

1191 " all: %s", " ".join(x.uvname for x in selected) 

1192 ) 

1193 else: 

1194 output_logger.info( 

1195 " most: (%d) .. %s", 

1196 len(selected), 

1197 " ".join(x.uvname for x in selected[-20:]), 

1198 ) 

1199 if self.options.check_consistency_level >= 3: 

1200 target_suite.check_suite_source_pkg_consistency( 

1201 "iter_packages after commit" 

1202 ) 

1203 nuninst_last_accepted = nuninst_after 

1204 for cruft_item in new_cruft: 

1205 try: 

1206 _, updates, rms, _ = mm.compute_groups(cruft_item) 

1207 result = (cruft_item, sorted(updates), sorted(rms)) 

1208 group_info[cruft_item] = result 

1209 worklist.append([cruft_item]) 

1210 except MigrationConstraintException as e: 

1211 output_logger.info( 

1212 " got exception adding cruft item %s to list: %s" 

1213 % (cruft_item.uvname, repr(e)) 

1214 ) 

1215 rescheduled_packages.extend(maybe_rescheduled_packages) 

1216 maybe_rescheduled_packages.clear() 

1217 else: 

1218 transaction.rollback() 

1219 broken = sorted( 

1220 b 

1221 for b in nuninst_after[failed_arch] 

1222 if b not in nuninst_last_accepted[failed_arch] 

1223 ) 

1224 compare_nuninst = None 

1225 if any( 

1226 item for item in comp if item.architecture != "source" 

1227 ): 

1228 compare_nuninst = nuninst_last_accepted 

1229 # NB: try_migration already reverted this for us, so just print the results and move on 

1230 output_logger.info( 

1231 "skipped: %s (%d, %d, %d)", 

1232 comp_name, 

1233 len(rescheduled_packages), 

1234 len(maybe_rescheduled_packages), 

1235 len(worklist), 

1236 ) 

1237 output_logger.info( 

1238 " got: %s", 

1239 self.eval_nuninst(nuninst_after, compare_nuninst), 

1240 ) 

1241 output_logger.info( 

1242 " * %s: %s", failed_arch, ", ".join(broken) 

1243 ) 

1244 if self.options.check_consistency_level >= 3: 

1245 target_suite.check_suite_source_pkg_consistency( 

1246 "iter_package after rollback (not accepted)" 

1247 ) 

1248 

1249 except MigrationConstraintException as e: 

1250 transaction.rollback() 

1251 output_logger.info( 

1252 "skipped: %s (%d, %d, %d)", 

1253 comp_name, 

1254 len(rescheduled_packages), 

1255 len(maybe_rescheduled_packages), 

1256 len(worklist), 

1257 ) 

1258 output_logger.info(" got exception: %s" % (repr(e))) 

1259 if self.options.check_consistency_level >= 3: 

1260 target_suite.check_suite_source_pkg_consistency( 

1261 "iter_package after rollback (MigrationConstraintException)" 

1262 ) 

1263 

1264 if not accepted: 

1265 if len(comp) > 1: 

1266 output_logger.info( 

1267 " - splitting the component into single items and retrying them" 

1268 ) 

1269 worklist.extend([item] for item in comp) 

1270 else: 

1271 maybe_rescheduled_packages.append(comp[0]) 

1272 

1273 output_logger.info(" finish: [%s]", ",".join(x.uvname for x in selected)) 

1274 output_logger.info("endloop: %s", self.eval_nuninst(self.nuninst_orig)) 

1275 output_logger.info(" now: %s", self.eval_nuninst(nuninst_last_accepted)) 

1276 format_and_log_uninst( 

1277 output_logger, 

1278 self.options.architectures, 

1279 newly_uninst(self.nuninst_orig, nuninst_last_accepted), 

1280 ) 

1281 output_logger.info("") 

1282 

1283 return (nuninst_last_accepted, maybe_rescheduled_packages) 

1284 
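The retry strategy at the end of the loop above can be sketched in isolation (`reject` is a hypothetical helper name): a rejected multi-item component is split into single-item components and re-queued, while a rejected single item is parked for a possible later pass.

```python
# Sketch of iter_packages()' fallback for rejected components:
# split multi-item components and retry each item alone; park singles.
def reject(comp, worklist, maybe_rescheduled):
    if len(comp) > 1:
        worklist.extend([item] for item in comp)
    else:
        maybe_rescheduled.append(comp[0])

worklist, maybe = [], []
reject(["a", "b"], worklist, maybe)
reject(["c"], worklist, maybe)
print(worklist, maybe)  # [['a'], ['b']] ['c']
```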

1285 def do_all( 

1286 self, 

1287 hinttype: Optional[str] = None, 

1288 init: Optional[list[MigrationItem]] = None, 

1289 actions: Optional[list[MigrationItem]] = None, 

1290 ) -> None: 

1291 """Testing update runner 

1292 

1293 This method tries to update testing checking the uninstallability 

1294 counters before and after the actions to decide if the update was 

1295 successful or not. 

1296 """ 

1297 selected = [] 

1298 if actions: 

1299 upgrade_me = actions[:] 

1300 else: 

1301 upgrade_me = self.upgrade_me[:] 

1302 nuninst_start = self.nuninst_orig 

1303 output_logger = self.output_logger 

1304 target_suite = self.suite_info.target_suite 

1305 

1306 # these are special parameters for hints processing 

1307 force = False 

1308 recurse = True 

1309 nuninst_end = None 

1310 extra: list[MigrationItem] = [] 

1311 mm = self._migration_manager 

1312 

1313 if hinttype == "easy" or hinttype == "force-hint": 

1314 force = hinttype == "force-hint" 

1315 recurse = False 

1316 

1317 # if we have a list of initial packages, check them 

1318 if init: 

1319 for x in init: 

1320 if x not in upgrade_me: 

1321 output_logger.warning( 

1322 "failed: %s is not a valid candidate (or it already migrated)", 

1323 x.uvname, 

1324 ) 

1325 return None 

1326 selected.append(x) 

1327 upgrade_me.remove(x) 

1328 

1329 output_logger.info("start: %s", self.eval_nuninst(nuninst_start)) 

1330 output_logger.info("orig: %s", self.eval_nuninst(nuninst_start)) 

1331 

1332 if not (init and not force): 

1333 # No "outer" transaction needed as we will never need to rollback 

1334 # (e.g. "force-hint" or a regular "main run"). Emulate the start_transaction 

1335 # call from the MigrationManager, so the rest of the code follows the 

1336 # same flow regardless of whether we need the transaction or not. 

1337 

1338 @contextlib.contextmanager 

1339 def _start_transaction() -> Iterator[Optional["MigrationTransactionState"]]: 

1340 yield None 

1341 

1342 else: 

1343 # We will need to be able to roll back (e.g. easy or a "hint"-hint) 

1344 _start_transaction = mm.start_transaction 

1345 

1346 with _start_transaction() as transaction: 

1347 if init: 

1348 # init => a hint (e.g. "easy") - so do the hint run 

1349 (_, nuninst_end, _, new_cruft) = mm.migrate_items_to_target_suite( 

1350 selected, self.nuninst_orig, stop_on_first_regression=False 

1351 ) 

1352 

1353 if recurse: 

1354 # Ensure upgrade_me and selected do not overlap, if we 

1355 # follow-up with a recurse ("hint"-hint). 

1356 upgrade_me = [x for x in upgrade_me if x not in set(selected)] 

1357 else: 

1358 # On non-recursive hints check for cruft and purge it proactively in case it "fixes" the hint. 

1359 cruft = [x for x in upgrade_me if x.is_cruft_removal] 

1360 if new_cruft: 

1361 output_logger.info( 

1362 "Change added new cruft items to list: %s", 

1363 " ".join(x.uvname for x in sorted(new_cruft)), 

1364 ) 

1365 cruft.extend(new_cruft) 

1366 if cruft: 

1367 output_logger.info("Checking if changes enables cruft removal") 

1368 (nuninst_end, remaining_cruft) = self.iter_packages( 

1369 cruft, selected, nuninst=nuninst_end 

1370 ) 

1371 output_logger.info( 

1372 "Removed %d of %d cruft item(s) after the changes", 

1373 len(cruft) - len(remaining_cruft), 

1374 len(cruft), 

1375 ) 

1376 new_cruft.difference_update(remaining_cruft) 

1377 

1378 # Add new cruft items regardless of whether we recurse. A future run might clean 

1379 # them for us. 

1380 upgrade_me.extend(new_cruft) 

1381 

1382 if recurse: 

1383 # Either the main run or the recursive run of a "hint"-hint. 

1384 (nuninst_end, extra) = self.iter_packages( 

1385 upgrade_me, selected, nuninst=nuninst_end 

1386 ) 

1387 

1388 assert nuninst_end is not None 

1389 nuninst_end_str = self.eval_nuninst(nuninst_end) 

1390 

1391 if not recurse: 

1392 # easy or force-hint 

1393 output_logger.info("easy: %s", nuninst_end_str) 

1394 

1395 if not force: 

1396 format_and_log_uninst( 

1397 self.output_logger, 

1398 self.options.architectures, 

1399 newly_uninst(nuninst_start, nuninst_end), 

1400 ) 

1401 

1402 if force: 

1403 # Force implies "unconditionally better" 

1404 better = True 

1405 else: 

1406 break_arches: set[str] = set(self.options.break_arches) 

1407 if all(x.architecture in break_arches for x in selected): 

1408 # If we only migrated items from break-arches, then we 

1409 # do not allow any regressions on these architectures. 

1410 # This usually only happens with hints 

1411 break_arches = set() 

1412 better = is_nuninst_asgood_generous( 

1413 self.constraints, 

1414 self.allow_uninst, 

1415 self.options.architectures, 

1416 self.nuninst_orig, 

1417 nuninst_end, 

1418 break_arches, 

1419 ) 

1420 

1421 if better: 

1422 # Result accepted either by force or by being better than the original result. 

1423 output_logger.info( 

1424 "final: %s", ",".join(sorted(x.uvname for x in selected)) 

1425 ) 

1426 output_logger.info("start: %s", self.eval_nuninst(nuninst_start)) 

1427 output_logger.info(" orig: %s", self.eval_nuninst(self.nuninst_orig)) 

1428 output_logger.info(" end: %s", nuninst_end_str) 

1429 if force: 

1430 broken = newly_uninst(nuninst_start, nuninst_end) 

1431 if broken: 

1432 output_logger.warning("force breaks:") 

1433 format_and_log_uninst( 

1434 self.output_logger, 

1435 self.options.architectures, 

1436 broken, 

1437 loglevel=logging.WARNING, 

1438 ) 

1439 else: 

1440 output_logger.info("force did not break any packages") 

1441 output_logger.info( 

1442 "SUCCESS (%d/%d)", len(actions or self.upgrade_me), len(extra) 

1443 ) 

1444 self.nuninst_orig = nuninst_end 

1445 self.all_selected += selected 

1446 if transaction: 

1447 transaction.commit() 

1448 if self.options.check_consistency_level >= 2: 

1449 target_suite.check_suite_source_pkg_consistency( 

1450 "do_all after commit" 

1451 ) 

1452 if not actions: 

1453 if recurse: 

1454 self.upgrade_me = extra 

1455 else: 

1456 self.upgrade_me = [ 

1457 x for x in self.upgrade_me if x not in set(selected) 

1458 ] 

1459 else: 

1460 output_logger.info("FAILED\n") 

1461 if not transaction: 

1462 # if we 'FAILED', but we cannot rollback, we will probably 

1463 # leave a broken state behind 

1464 # this should not happen 

1465 raise AssertionError("do_all FAILED but no transaction to rollback") 

1466 transaction.rollback() 

1467 if self.options.check_consistency_level >= 2: 

1468 target_suite.check_suite_source_pkg_consistency( 

1469 "do_all after rollback" 

1470 ) 

1471 

1472 output_logger.info("") 

1473 

1474 def assert_nuninst_is_correct(self) -> None: 

1475 self.logger.info("> Update complete - Verifying non-installability counters") 

1476 

1477 cached_nuninst = self.nuninst_orig 

1478 self._inst_tester.compute_installability() 

1479 computed_nuninst = compile_nuninst( 

1480 self.suite_info.target_suite, 

1481 self.options.architectures, 

1482 self.options.nobreakall_arches, 

1483 ) 

1484 if cached_nuninst != computed_nuninst: # pragma: no cover 

1485 only_on_break_archs = True 

1486 self.logger.error( 

1487 "==================== NUNINST OUT OF SYNC =========================" 

1488 ) 

1489 for arch in self.options.architectures: 

1490 expected_nuninst = set(cached_nuninst[arch]) 

1491 actual_nuninst = set(computed_nuninst[arch]) 

1492 false_negatives = actual_nuninst - expected_nuninst 

1493 false_positives = expected_nuninst - actual_nuninst 

1494 # Britney does not quite work correctly with 

1495 # break/fucked arches, so ignore issues there for now. 

1496 if ( 

1497 false_negatives or false_positives 

1498 ) and arch not in self.options.break_arches: 

1499 only_on_break_archs = False 

1500 if false_negatives: 

1501 self.logger.error( 

1502 " %s - unnoticed nuninst: %s", arch, str(false_negatives) 

1503 ) 

1504 if false_positives: 

1505 self.logger.error( 

1506 " %s - invalid nuninst: %s", arch, str(false_positives) 

1507 ) 

1508 if false_negatives or false_positives: 

1509 self.logger.info( 

1510 " %s - actual nuninst: %s", arch, str(sorted(actual_nuninst)) 

1511 ) 

1512 self.logger.error( 

1513 "==================== NUNINST OUT OF SYNC =========================" 

1514 ) 

1515 if not only_on_break_archs: 

1516 raise AssertionError("NUNINST OUT OF SYNC") 

1517 else: 

1518 self.logger.warning("Nuninst is out of sync on some break arches") 

1519 

1520 self.logger.info("> All non-installability counters are ok") 

1521 

1522 def upgrade_testing(self) -> None: 

1523 """Upgrade testing using the packages from the source suites 

1524 

1525 This method tries to upgrade testing using the packages from the 

1526 source suites. 

1527 Before running the do_all method, it tries the easy and force-hint 

1528 commands. 

1529 """ 

1530 

1531 output_logger = self.output_logger 

1532 self.logger.info("Starting the upgrade test") 

1533 output_logger.info( 

1534 "Generated on: %s", 

1535 time.strftime("%Y.%m.%d %H:%M:%S %z", time.gmtime(time.time())), 

1536 ) 

1537 output_logger.info("Arch order is: %s", ", ".join(self.options.architectures)) 

1538 

1539 if not self.options.actions: 

1540 # process `easy' hints 

1541 for x in self.hints["easy"]: 

1542 self.do_hint("easy", x.user, x.packages) 

1543 

1544 # process `force-hint' hints 

1545 for x in self.hints["force-hint"]: 

1546 self.do_hint("force-hint", x.user, x.packages) 

1547 

1548 # run the first round of the upgrade 

1549 # - do separate runs for break arches 

1550 allpackages = [] 

1551 normpackages = self.upgrade_me[:] 

1552 archpackages = {} 

1553 for a in self.options.break_arches: 

1554 archpackages[a] = [p for p in normpackages if p.architecture == a] 

1555 normpackages = [p for p in normpackages if p not in archpackages[a]] 

1556 self.upgrade_me = normpackages 

1557 output_logger.info("info: main run") 

1558 self.do_all() 

1559 allpackages += self.upgrade_me 

1560 for a in self.options.break_arches: 

1561 backup = self.options.break_arches 

1562 self.options.break_arches = " ".join( 

1563 x for x in self.options.break_arches if x != a 

1564 ) 

1565 self.upgrade_me = archpackages[a] 

1566 output_logger.info("info: broken arch run for %s", a) 

1567 self.do_all() 

1568 allpackages += self.upgrade_me 

1569 self.options.break_arches = backup 

1570 self.upgrade_me = allpackages 

1571 

1572 if self.options.actions: 

1573 self.printuninstchange() 

1574 return 

1575 

1576 # process `hint' hints 

1577 hintcnt = 0 

1578 for x in self.hints["hint"][:50]: 

1579 if hintcnt > 50: 

1580 output_logger.info("Skipping remaining hints...") 

1581 break 

1582 if self.do_hint("hint", x.user, x.packages): 

1583 hintcnt += 1 

1584 

1585 # run the auto hinter 

1586 self.run_auto_hinter() 

1587 

1588 if getattr(self.options, "remove_obsolete", "yes") == "yes": 

1589 # obsolete source packages 

1590 # a package is obsolete if none of the binary packages in testing 

1591 # are built by it 

1592 self.logger.info( 

1593 "> Removing obsolete source packages from the target suite" 

1594 ) 

1595 # local copies for performance 

1596 target_suite = self.suite_info.target_suite 

1597 sources_t = target_suite.sources 

1598 binaries_t = target_suite.binaries 

1599 mi_factory = self._migration_item_factory 

1600 used = set( 

1601 binaries_t[arch][binary].source 

1602 for arch in binaries_t 

1603 for binary in binaries_t[arch] 

1604 ) 

1605 removals = [ 

1606 mi_factory.parse_item( 

1607 "-%s/%s" % (source, sources_t[source].version), auto_correct=False 

1608 ) 

1609 for source in sources_t 

1610 if source not in used 

1611 ] 

1612 if removals: 

1613 output_logger.info( 

1614 "Removing obsolete source packages from the target suite (%d):", 

1615 len(removals), 

1616 ) 

1617 self.do_all(actions=removals) 

1618 

1619 # smooth updates 

1620 removals = old_libraries( 

1621 self._migration_item_factory, self.suite_info, self.options.outofsync_arches 

1622 ) 

1623 if removals: 

1624 output_logger.info( 

1625 "Removing packages left in the target suite (e.g. smooth updates or cruft)" 

1626 ) 

1627 log_and_format_old_libraries(self.output_logger, removals) 

1628 self.do_all(actions=removals) 

1629 removals = old_libraries( 

1630 self._migration_item_factory, 

1631 self.suite_info, 

1632 self.options.outofsync_arches, 

1633 ) 

1634 

1635 output_logger.info( 

1636 "List of old libraries in the target suite (%d):", len(removals) 

1637 ) 

1638 log_and_format_old_libraries(self.output_logger, removals) 

1639 

1640 self.printuninstchange() 

1641 if self.options.check_consistency_level >= 1: 1641 ↛ 1647line 1641 didn't jump to line 1647, because the condition on line 1641 was never false

1642 target_suite = self.suite_info.target_suite 

1643 self.assert_nuninst_is_correct() 

1644 target_suite.check_suite_source_pkg_consistency("end") 

1645 

1646 # output files 

1647 if self.options.heidi_output and not self.options.dry_run: 1647 ↛ 1661line 1647 didn't jump to line 1661, because the condition on line 1647 was never false

1648 target_suite = self.suite_info.target_suite 

1649 

1650 # write HeidiResult 

1651 self.logger.info("Writing Heidi results to %s", self.options.heidi_output) 

1652 write_heidi( 

1653 self.options.heidi_output, 

1654 target_suite, 

1655 outofsync_arches=self.options.outofsync_arches, 

1656 ) 

1657 

1658 self.logger.info("Writing delta to %s", self.options.heidi_delta_output) 

1659 write_heidi_delta(self.options.heidi_delta_output, self.all_selected) 

1660 

1661 self.logger.info("Test completed!") 

1662 

1663 def printuninstchange(self) -> None: 

1664 self.logger.info("Checking for newly uninstallable packages") 

1665 uninst = newly_uninst(self.nuninst_orig_save, self.nuninst_orig) 

1666 

1667 if uninst: 

1668 self.output_logger.warning("") 

1669 self.output_logger.warning( 

1670 "Newly uninstallable packages in the target suite:" 

1671 ) 

1672 format_and_log_uninst( 

1673 self.output_logger, 

1674 self.options.architectures, 

1675 uninst, 

1676 loglevel=logging.WARNING, 

1677 ) 

1678 

1679 def hint_tester(self) -> None: 

1680 """Run a command line interface to test hints 

1681 

1682 This method provides a command line interface for the release team to 

1683 try hints and evaluate the results. 

1684 """ 

1685 import readline 

1686 

1687 from britney2.completer import Completer 

1688 

1689 histfile = os.path.expanduser("~/.britney2_history") 

1690 if os.path.exists(histfile): 

1691 readline.read_history_file(histfile) 

1692 

1693 readline.parse_and_bind("tab: complete") 

1694 readline.set_completer(Completer(self).completer) 

1695 # Package names can contain "-" and we use "/" in our presentation of them as well, 

1696 # so ensure readline does not split on these characters. 

1697 readline.set_completer_delims( 

1698 readline.get_completer_delims().replace("-", "").replace("/", "") 

1699 ) 

1700 

1701 known_hints = self._hint_parser.registered_hints 

1702 

1703 print("Britney hint tester") 

1704 print() 

1705 print( 

1706 "Besides inputting known britney hints, the following commands are also available" 

1707 ) 

1708 print(" * quit/exit - terminates the shell") 

1709 print( 

1710 " * python-console - jump into an interactive python shell (with the current loaded dataset)" 

1711 ) 

1712 print() 

1713 

1714 while True: 

1715 # read the command from the command line 

1716 try: 

1717 user_input = input("britney> ").split() 

1718 except EOFError: 

1719 print("") 

1720 break 

1721 except KeyboardInterrupt: 

1722 print("") 

1723 continue 

1724 # quit the hint tester 

1725 if user_input and user_input[0] in ("quit", "exit"): 

1726 break 

1727 elif user_input and user_input[0] == "python-console": 

1728 try: 

1729 import britney2.console 

1730 except ImportError as e: 

1731 print("Failed to import britney.console module: %s" % repr(e)) 

1732 continue 

1733 britney2.console.run_python_console(self) 

1734 print("Returning to the britney hint-tester console") 

1735 # run a hint 

1736 elif user_input and user_input[0] in ("easy", "hint", "force-hint"): 

1737 mi_factory = self._migration_item_factory 

1738 try: 

1739 self.do_hint( 

1740 user_input[0], 

1741 "hint-tester", 

1742 mi_factory.parse_items(user_input[1:]), 

1743 ) 

1744 self.printuninstchange() 

1745 except KeyboardInterrupt: 

1746 continue 

1747 elif user_input and user_input[0] in known_hints: 

1748 self._hint_parser.parse_hints( 

1749 "hint-tester", self.HINTS_ALL, "<stdin>", [" ".join(user_input)] 

1750 ) 

1751 self.write_excuses() 

1752 

1753 try: 

1754 readline.write_history_file(histfile) 

1755 except IOError as e: 

1756 self.logger.warning("Could not write %s: %s", histfile, e) 

1757 

1758 def do_hint(self, hinttype: str, who: str, pkgvers: list[MigrationItem]) -> bool: 

1759 """Process hints 

1760 

1761 This method processes `easy`, `hint` and `force-hint` hints. If the 

1762 requested version is not in the relevant source suite, then the hint 

1763 is skipped. 

1764 """ 

1765 

1766 output_logger = self.output_logger 

1767 

1768 self.logger.info("> Processing '%s' hint from %s", hinttype, who) 

1769 output_logger.info( 

1770 "Trying %s from %s: %s", 

1771 hinttype, 

1772 who, 

1773 " ".join("%s/%s" % (x.uvname, x.version) for x in pkgvers), 

1774 ) 

1775 

1776 issues = [] 

1777 # loop on the requested packages and versions 

1778 for idx in range(len(pkgvers)): 

1779 pkg = pkgvers[idx] 

1780 # skip removal requests 

1781 if pkg.is_removal: 

1782 continue 

1783 

1784 suite = pkg.suite 

1785 

1786 assert pkg.version is not None 

1787 if pkg.package not in suite.sources: 

1788 issues.append( 

1789 "Source %s has no version in %s" % (pkg.package, suite.name) 

1790 ) 

1791 elif ( 

1792 apt_pkg.version_compare(suite.sources[pkg.package].version, pkg.version) 

1793 != 0 

1794 ): 

1795 issues.append( 

1796 "Version mismatch, %s %s != %s" 

1797 % (pkg.package, pkg.version, suite.sources[pkg.package].version) 

1798 ) 

1799 if issues: 

1800 output_logger.warning("%s: Not using hint", ", ".join(issues)) 

1801 return False 

1802 

1803 self.do_all(hinttype, pkgvers) 

1804 return True 
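A hypothetical stand-in for the validation loop in `do_hint` above: real Britney compares Debian versions with `apt_pkg.version_compare`, whereas this sketch uses plain string equality, and `validate_hint`/`sources` are illustration-only names.

```python
# Sketch of do_hint's validation: collect issues for packages that are
# missing from the source suite or whose version does not match.
def validate_hint(sources, pkgvers):
    issues = []
    for pkg, ver in pkgvers:
        if pkg not in sources:
            issues.append("Source %s has no version" % pkg)
        elif sources[pkg] != ver:  # apt_pkg.version_compare(...) != 0 in Britney
            issues.append(
                "Version mismatch, %s %s != %s" % (pkg, ver, sources[pkg])
            )
    return issues

issues = validate_hint(
    {"glibc": "2.36-9"},
    [("glibc", "2.36-8"), ("gcc-13", "13.2.0-1")],
)
print(issues)  # one mismatch, one missing source
```

As in Britney, a single failing package rejects the whole hint: the caller bails out if the issue list is non-empty.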

1805 

1806 def get_auto_hinter_hints( 

1807 self, upgrade_me: list[MigrationItem] 

1808 ) -> list[list[frozenset[MigrationItem]]]: 

1809 """Auto-generate "easy" hints. 

1810 

1811 This method attempts to generate "easy" hints for sets of packages which 

1812 must migrate together. Beginning with a package which does not depend on 

1813 any other package (in terms of excuses), a list of dependencies and 

1814 reverse dependencies is recursively created. 

1815 

1816 Once all such lists have been generated, any which are subsets of other 

1817 lists are ignored in favour of the larger lists. The remaining lists are 

1818 then attempted in turn as "easy" hints. 

1819 

1820 We also try to auto-hint circular dependencies by analyzing the update 

1821 excuses relationships. If they form a circular dependency, which we already 

1822 know does not work with the standard do_all algorithm, we try to `easy` them. 

1823 """ 

1824 self.logger.info("> Processing hints from the auto hinter") 

1825 

1826 sources_t = self.suite_info.target_suite.sources 

1827 excuses = self.excuses 

1828 

1829 def excuse_still_valid(excuse: "Excuse") -> bool: 

1830 source = excuse.source 

1831 assert isinstance(excuse.item, MigrationItem) 

1832 arch = excuse.item.architecture 

1833 # TODO for binNMUs, this check is always ok, even if the item 

1834 # migrated already 

1835 valid = ( 

1836 arch != "source" 

1837 or source not in sources_t 

1838 or sources_t[source].version != excuse.ver[1] 

1839 ) 

1840 # TODO migrated items should be removed from upgrade_me, so this 

1841 # should not happen 

1842 if not valid: 

1843 raise AssertionError("excuse no longer valid %s" % (excuse.item)) 

1844 return valid 

1845 

1846 # consider only excuses which are valid candidates and still relevant. 

1847 valid_excuses = frozenset( 

1848 e.name 

1849 for n, e in excuses.items() 

1850 if e.item in upgrade_me and excuse_still_valid(e) 

1851 ) 

1852 excuses_deps = { 

1853 name: valid_excuses.intersection(excuse.get_deps()) 

1854 for name, excuse in excuses.items() 

1855 if name in valid_excuses 

1856 } 

1857 excuses_rdeps = defaultdict(set) 

1858 for name, deps in excuses_deps.items(): 

1859 for dep in deps: 

1860 excuses_rdeps[dep].add(name) 

1861 

1862 # loop on them 

1863 candidates = [] 

1864 mincands = [] 

1865 seen_hints = set() 

1866 for e in valid_excuses: 

1867 excuse = excuses[e] 

1868 if not excuse.get_deps(): 

1869 assert isinstance(excuse.item, MigrationItem) 

1870 items = [excuse.item] 

1871 orig_size = 1 

1872 looped = False 

1873 seen_items = set() 

1874 seen_items.update(items) 

1875 

1876 for item in items: 

1877 assert isinstance(item, MigrationItem) 

1878 # excuses which depend on "item" or are depended on by it 

1879 new_items = cast( 

1880 set[MigrationItem], 

1881 { 

1882 excuses[x].item 

1883 for x in chain( 

1884 excuses_deps[item.name], excuses_rdeps[item.name] 

1885 ) 

1886 }, 

1887 ) 

1888 new_items -= seen_items 

1889 items.extend(new_items) 

1890 seen_items.update(new_items) 

1891 

1892 if not looped and len(items) > 1: 

1893 orig_size = len(items) 

1894 h = frozenset(seen_items) 

1895 if h not in seen_hints: 

1896 mincands.append(h) 

1897 seen_hints.add(h) 

1898 looped = True 

1899 if len(items) != orig_size: 

1900 h = frozenset(seen_items) 

1901 if h != mincands[-1] and h not in seen_hints: 

1902 candidates.append(h) 

1903 seen_hints.add(h) 

1904 return [candidates, mincands] 
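The inner loop above grows `items` while iterating over it, so the walk is a transitive closure over the union of deps and rdeps. A toy sketch of that closure (the graph and names here are made up for illustration):

```python
from collections import defaultdict
from itertools import chain

# Toy excuse-dependency graph: "b" depends on "a", "c" on "b"; "d" is
# unrelated and must not be pulled in.
deps = {"a": set(), "b": {"a"}, "c": {"b"}, "d": set()}
rdeps = defaultdict(set)
for name, ds in deps.items():
    for d in ds:
        rdeps[d].add(name)

# Same pattern as get_auto_hinter_hints: the list grows while we iterate.
items = ["a"]
seen = set(items)
for item in items:
    new = set(chain(deps[item], rdeps[item])) - seen
    items.extend(new)
    seen.update(new)

print(sorted(seen))  # → ['a', 'b', 'c']
```

Iterating over a list that is being extended is safe in Python and terminates here because `seen` prevents re-adding nodes.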

1905 

1906 def run_auto_hinter(self) -> None: 

1907 for lst in self.get_auto_hinter_hints(self.upgrade_me): 

1908 for hint in lst: 

1909 self.do_hint("easy", "autohinter", sorted(hint)) 

1910 

1911 def nuninst_arch_report(self, nuninst: dict[str, set[str]], arch: str) -> None: 

1912 """Print a report of uninstallable packages for one architecture.""" 

1913 all = defaultdict(set) 

1914 binaries_t = self.suite_info.target_suite.binaries 

1915 for p in nuninst[arch]: 

1916 pkg = binaries_t[arch][p] 

1917 all[(pkg.source, pkg.source_version)].add(p) 

1918 

1919 print("* %s" % arch) 

1920 

1921 for (src, ver), pkgs in sorted(all.items()): 

1922 print(" %s (%s): %s" % (src, ver, " ".join(sorted(pkgs)))) 

1923 

1924 print() 
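The grouping in `nuninst_arch_report` is a standard `defaultdict(set)` bucket-by-key pattern. A self-contained toy version (binary-to-source data invented for illustration):

```python
from collections import defaultdict

# Map each uninstallable binary to its (source, source_version) pair.
meta = {
    "libfoo1": ("foo", "1.0"),
    "foo-utils": ("foo", "1.0"),
    "bar": ("bar", "2.1"),
}

grouped = defaultdict(set)
for binpkg, srcver in meta.items():
    grouped[srcver].add(binpkg)

for (src, ver), pkgs in sorted(grouped.items()):
    print("  %s (%s): %s" % (src, ver, " ".join(sorted(pkgs))))
# →   bar (2.1): bar
# →   foo (1.0): foo-utils libfoo1
```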

1925 

1926 def _remove_archall_faux_packages(self) -> None: 

1927 """Remove faux packages added for the excuses phase 

1928 

1929 To prevent binary packages from going missing while they are still 

1930 listed by their source package, we add bin:faux packages when reading 

1931 in the Sources. They are used during the excuses phase to prevent 

1932 packages from becoming candidates. However, they interfere in complex 

1933 ways with the installability phase, so rather than making all the 

1934 migration code aware of this excuses-phase implementation detail, we 

1935 remove them again. 

1936 

1937 """ 

1938 if not self.options.archall_inconsistency_allowed: 

1939 all_binaries = self.all_binaries 

1940 faux_a = {x for x in all_binaries.keys() if x[2] == "faux"} 

1941 for pkg_a in faux_a: 

1942 del all_binaries[pkg_a] 

1943 

1944 for suite in self.suite_info._suites.values(): 

1945 for arch in suite.binaries.keys(): 

1946 binaries = suite.binaries[arch] 

1947 faux_b = {x for x in binaries if binaries[x].pkg_id[2] == "faux"} 

1948 for pkg_b in faux_b: 

1949 del binaries[pkg_b] 

1950 sources = suite.sources 

1951 for src in sources.keys(): 

1952 faux_s = {x for x in sources[src].binaries if x[2] == "faux"} 

1953 sources[src].binaries -= faux_s 

1954 

1955 def main(self) -> None: 

1956 """Main method 

1957 

1958 This is the entry point for the class: it includes the list of calls 

1959 for the member methods which will produce the output files. 

1960 """ 

1961 # if running in --print-uninst mode, quit 

1962 if self.options.print_uninst: 

1963 return 

1964 # if no actions are provided, build the excuses and sort them 

1965 elif not self.options.actions: 

1966 self.write_excuses() 

1967 # otherwise, use the actions provided by the command line 

1968 else: 

1969 self.upgrade_me = self.options.actions.split() 

1970 

1971 self._remove_archall_faux_packages() 

1972 

1973 if self.options.compute_migrations or self.options.hint_tester: 

1974 if self.options.dry_run: 

1975 self.logger.info( 

1976 "Upgrade output not (also) written to a separate file" 

1977 " as this is a dry-run." 

1978 ) 

1979 elif hasattr(self.options, "upgrade_output"): 

1980 upgrade_output = self.options.upgrade_output 

1981 file_handler = logging.FileHandler( 

1982 upgrade_output, mode="w", encoding="utf-8" 

1983 ) 

1984 output_formatter = logging.Formatter("%(message)s") 

1985 file_handler.setFormatter(output_formatter) 

1986 self.output_logger.addHandler(file_handler) 

1987 self.logger.info("Logging upgrade output to %s", upgrade_output) 

1988 else: 

1989 self.logger.info( 

1990 "Upgrade output not (also) written to a separate file" 

1991 " as the UPGRADE_OUTPUT configuration is not provided." 

1992 ) 

1993 

1994 # run the hint tester 

1995 if self.options.hint_tester: 

1996 self.hint_tester() 

1997 # run the upgrade test 

1998 else: 

1999 self.upgrade_testing() 

2000 

2001 self.logger.info("> Stats from the installability tester") 

2002 for stat in self._inst_tester.stats.stats(): 

2003 self.logger.info("> %s", stat) 

2004 else: 

2005 self.logger.info("Migration computation skipped as requested.") 

2006 if not self.options.dry_run: 

2007 self._policy_engine.save_state(self) 

2008 logging.shutdown() 

2009 

2010 

2011if __name__ == "__main__": 

2012 Britney().main()