author    | Andrew Geissler <geissonator@yahoo.com> | 2019-02-25 18:54:23 -0600
committer | Andrew Geissler <geissonator@yahoo.com> | 2019-02-25 18:55:01 -0600
commit    | 99467dab23c4af816958fdd98218ca613308b402 (patch)
tree      | de31fa6e710794fb8435279b8cc7f48dbe241f26 /poky/scripts
parent    | 0c13e4cf5913a901598c0c13ba172ce6e5a7b4f6 (diff)
poky: refresh thud: b904775c2b..7c76c5d78b
Update poky to thud HEAD.
Adam Trhon (1):
  icecc-env: don't raise error when icecc not installed
Alexander Kanavin (1):
  openssl10: update to 1.0.2q
Armin Kuster (1):
  perl: add testdepends for ssh
Bruce Ashfield (2):
  linux-yocto/4.18: update to v4.18.26
  linux-yocto/4.18: update to v4.18.27
Changqing Li (1):
  checklayer: generate locked-sigs.inc under builddir
Dan Dedrick (2):
  devtool: remove duplicate overrides
  devtool: improve git repo checks before check_commits logic
Daniel Ammann (1):
  ref-manual: Typo found and fixed.
Douglas Royds (2):
  openssl ptest: Strip build host paths from configdata.pm
  openssl: Strip perl version from installed ptest configdata.pm file
Dustin Bain (1):
  busybox: update to 1.29.3
Jan Kiszka (1):
  oe-git-proxy: Avoid resolving NO_PROXY against local files
Jens Rehsack (1):
  avahi: avoid depending on skipped package
Jonas Bonn (1):
  keymaps: tighten package write dependency
Kai Kang (1):
  selftest/wic: update test case test_qemu
Khem Raj (3):
  openssl10: Fix mutliple include assumptions for bn.h in opensslconf.h
  send-error-report: Use https instead of http protocol
  multilib_header_wrapper.h: Use #pragma once
Leonardo Augusto (1):
  scripts/lib/wic/engine: Fix cp's target path for ext* filesystems
Liu Haitao (1):
  iw: fix parsing of WEP keys
Mingli Yu (1):
  logrotate.py: restore /etc/logrotate.d/wtmp
Otavio Salvador (1):
  linux-firmware: Bump to 710963f revision
Ovidiu Panait (1):
  ghostscript: Fix CVE-2019-6116
Peter Kjellerstedt (1):
  libaio: Extend to native
Richard Purdie (23):
  package: Add pkg_postinst_ontarget to PACKAGEVARS
  oeqa/runtime/ptest: Avoid traceback for tests with no section
  oeqa/utils/logparser: Simplify ptest log parsing code
  oeqa/logparser: Further simplification/clarification
  oeqa/logparser: Reform the ptest results parser
  oeqa/utils/logparser: Add in support for duration, exitcode and logs by section
  oeqa/logparser: Improve results handling
  oeqa/logparser: Various misc cleanups
  oeqa/runtime/ptest: Ensure OOM errors are logged
  scripts/contrib/build-perf-test-wrapper.sh: Improve interaction with autobuilder automation
  scripts/contrib/build-perf-test.sh: Remove it
  oe-build-perf-report: Allow branch without hostname
  oe-build-perf-report: Allow commits from different branches
  oe-build-perf-report: Improve branch comparision handling
  oe-build-perf-report: Fix missing buildstats comparisions
  wic/engine: Fix missing parted autobuilder failures
  lib/buildstats: Improve error message
  scripts/oe-git-archive: Separate out functionality to library function
  oe-build-perf-report/gitarchive: Move common useful functions to library
  bitbake: runqueue: Fix dependency loop analysis 'hangs'
  bitbake: runqueue: Filter out multiconfig dependencies from BB_TASKDEPDATA
  bitbake: siggen: Fix multiconfig corner case
  bitbake: cooker: Tweak multiconfig dependency resolution
Robert Yang (5):
  bluez5: Fix a race issue for tools
  yocto-check-layer-wrapper: Fix path for oe-init-build-env
  checklayer: Avoid adding the layer if it is already present
  runqemu: Let qemuparams override default settings
  runqemu: Make QB_MEM easier to set
Ross Burton (3):
  e2fsprogs: fix file system generation with large files
  linux-firmware: recommend split up packages
  linux-firmware: split out liquidio firmware
Scott Rifenbark (2):
  poky.ent: Updated "meta-intel" version to "10.1"
  overview-manual, mega-manual: Updated Package Feeds diagram
Serhey Popovych (1):
  openssl: Skip assembler optimized code for powerpc64 with musl
William Bourque (1):
  wic/engine.py: Load paths from PATH environment variable
Xulin Sun (1):
  openssl: fix multilib file install conflicts
Zheng Ruoqin (1):
  mdadm: add init and service scripts
Change-Id: Ib14c2fb69d25d84aa3d4bf0a6715bba57d1eb900
Signed-off-by: Andrew Geissler <geissonator@yahoo.com>
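The list above is the standard git shortlog output for the revision range named in the subject line. As a rough sketch (the clone location is hypothetical; the range endpoints are the ones quoted above, and the clone is assumed to already contain both revisions), the same listing can be reproduced against a poky checkout:

```bash
# Hypothetical path to an upstream poky clone; adjust to your checkout.
cd ~/src/poky

# Individual commits pulled in by this refresh
# (range endpoints taken from the commit subject above).
git log --oneline --no-merges b904775c2b..7c76c5d78b

# Grouped by author, matching the format of the list above.
git shortlog --no-merges b904775c2b..7c76c5d78b
```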
Diffstat (limited to 'poky/scripts')
-rwxr-xr-x | poky/scripts/contrib/build-perf-test-wrapper.sh |  37
-rwxr-xr-x | poky/scripts/contrib/build-perf-test.sh         | 400
-rw-r--r-- | poky/scripts/lib/buildstats.py                  |   2
-rw-r--r-- | poky/scripts/lib/checklayer/__init__.py         |  42
-rw-r--r-- | poky/scripts/lib/devtool/standard.py            |  11
-rw-r--r-- | poky/scripts/lib/wic/engine.py                  |  13
-rw-r--r-- | poky/scripts/multilib_header_wrapper.h          |   1
-rwxr-xr-x | poky/scripts/oe-build-perf-report               | 124
-rwxr-xr-x | poky/scripts/oe-git-archive                     | 166
-rwxr-xr-x | poky/scripts/oe-git-proxy                       |   4
-rwxr-xr-x | poky/scripts/runqemu                            |  24
-rwxr-xr-x | poky/scripts/send-error-report                  |   6
-rwxr-xr-x | poky/scripts/yocto-check-layer                  |   6
-rwxr-xr-x | poky/scripts/yocto-check-layer-wrapper          |   4
14 files changed, 150 insertions, 690 deletions
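The diffstat above is limited to poky/scripts; the same statistics can be reproduced locally with plain git. A minimal sketch, assuming a checkout of the tree that contains this commit (the checkout path is hypothetical; the hash is the one listed in the header):

```bash
# Hypothetical checkout containing commit 99467dab...; adjust the path.
cd ~/src/talos-openbmc

# Reproduce the per-file statistics above, limited to poky/scripts.
git show --stat 99467dab23c4af816958fdd98218ca613308b402 -- poky/scripts

# Show the full change to a single file of interest, e.g. runqemu.
git show 99467dab23c4af816958fdd98218ca613308b402 -- poky/scripts/runqemu
```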
diff --git a/poky/scripts/contrib/build-perf-test-wrapper.sh b/poky/scripts/contrib/build-perf-test-wrapper.sh index 19bee1dd0..7cbb5d794 100755 --- a/poky/scripts/contrib/build-perf-test-wrapper.sh +++ b/poky/scripts/contrib/build-perf-test-wrapper.sh @@ -33,7 +33,9 @@ Optional arguments: -c COMMITISH test (checkout) this commit, <branch>:<commit> can be specified to test specific commit of certain branch -C GIT_REPO commit results into Git + -d DOWNLOAD_DIR directory to store downloaded sources in -E EMAIL_ADDR send email report + -g GLOBALRES_DIR where to place the globalres file -P GIT_REMOTE push results to a remote Git repository -R DEST rsync reports to a remote destination -w WORK_DIR work dir for this script @@ -51,19 +53,26 @@ get_os_release_var () { commitish="" oe_build_perf_test_extra_opts=() oe_git_archive_extra_opts=() -while getopts "ha:c:C:E:P:R:w:x" opt; do +while getopts "ha:c:C:d:E:g:P:R:w:x" opt; do case $opt in h) usage exit 0 ;; - a) archive_dir=`realpath -s "$OPTARG"` + a) mkdir -p "$OPTARG" + archive_dir=`realpath -s "$OPTARG"` ;; c) commitish=$OPTARG ;; - C) results_repo=`realpath -s "$OPTARG"` + C) mkdir -p "$OPTARG" + results_repo=`realpath -s "$OPTARG"` + ;; + d) download_dir=`realpath -s "$OPTARG"` ;; E) email_to="$OPTARG" ;; + g) mkdir -p "$OPTARG" + globalres_dir=`realpath -s "$OPTARG"` + ;; P) oe_git_archive_extra_opts+=("--push" "$OPTARG") ;; R) rsync_dst="$OPTARG" @@ -86,6 +95,17 @@ if [ $# -ne 0 ]; then exit 1 fi +if [ -n "$email_to" ]; then + if ! [ -x "$(command -v phantomjs)" ]; then + echo "ERROR: Sending email needs phantomjs." + exit 1 + fi + if ! [ -x "$(command -v optipng)" ]; then + echo "ERROR: Sending email needs optipng." + exit 1 + fi +fi + # Open a file descriptor for flock and acquire lock LOCK_FILE="/tmp/oe-build-perf-test-wrapper.lock" if ! exec 3> "$LOCK_FILE"; then @@ -146,11 +166,18 @@ if [ -z "$base_dir" ]; then fi echo "Using working dir $base_dir" +if [ -z "$download_dir" ]; then + download_dir="$base_dir/downloads" +fi +if [ -z "$globalres_dir" ]; then + globalres_dir="$base_dir" +fi + timestamp=`date "+%Y%m%d%H%M%S"` git_rev=$(git rev-parse --short HEAD) || exit 1 build_dir="$base_dir/build-$git_rev-$timestamp" results_dir="$base_dir/results-$git_rev-$timestamp" -globalres_log="$base_dir/globalres.log" +globalres_log="$globalres_dir/globalres.log" machine="qemux86" mkdir -p "$base_dir" @@ -161,7 +188,7 @@ auto_conf="$build_dir/conf/auto.conf" echo "MACHINE = \"$machine\"" > "$auto_conf" echo 'BB_NUMBER_THREADS = "8"' >> "$auto_conf" echo 'PARALLEL_MAKE = "-j 8"' >> "$auto_conf" -echo "DL_DIR = \"$base_dir/downloads\"" >> "$auto_conf" +echo "DL_DIR = \"$download_dir\"" >> "$auto_conf" # Disabling network sanity check slightly reduces the variance of timing results echo 'CONNECTIVITY_CHECK_URIS = ""' >> "$auto_conf" # Possibility to define extra settings diff --git a/poky/scripts/contrib/build-perf-test.sh b/poky/scripts/contrib/build-perf-test.sh deleted file mode 100755 index 9a091edb0..000000000 --- a/poky/scripts/contrib/build-perf-test.sh +++ /dev/null @@ -1,400 +0,0 @@ -#!/bin/bash -# -# This script runs a series of tests (with and without sstate) and reports build time (and tmp/ size) -# -# Build performance test script -# -# Copyright 2013 Intel Corporation -# -# This program is free software; you can redistribute it and/or modify -# it under the terms of the GNU General Public License as published by -# the Free Software Foundation; either version 2 of the License, or -# (at your option) any later version. 
-# -# This program is distributed in the hope that it will be useful, -# but WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the -# GNU General Public License for more details. -# -# You should have received a copy of the GNU General Public License -# along with this program; if not, write to the Free Software -# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA -# -# -# AUTHORS: -# Stefan Stanacar <stefanx.stanacar@intel.com> - - -ME=$(basename $0) - -# -# usage and setup -# - -usage () { -cat << EOT -Usage: $ME [-h] - $ME [-c <commit>] [-v] [-m <val>] [-j <val>] [-t <val>] [-i <image-name>] [-d <path>] -Options: - -h - Display this help and exit. - -c <commit> - git checkout <commit> before anything else - -v - Show bitbake output, don't redirect it to a log. - -m <machine> - Value for MACHINE. Default is qemux86. - -j <val> - Value for PARALLEL_MAKE. Default is 8. - -t <val> - Value for BB_NUMBER_THREADS. Default is 8. - -i <image-name> - Instead of timing against core-image-sato, use <image-name> - -d <path> - Use <path> as DL_DIR - -p <githash> - Cherry pick githash onto the commit - -Note: current working directory must be inside a poky git clone. - -EOT -} - - -if clonedir=$(git rev-parse --show-toplevel); then - cd $clonedir -else - echo "The current working dir doesn't seem to be a poky git clone. Please cd there before running $ME" - exit 1 -fi - -IMAGE="core-image-sato" -verbose=0 -dldir= -commit= -pmake= -cherrypicks= -while getopts "hvc:m:j:t:i:d:p:" opt; do - case $opt in - h) usage - exit 0 - ;; - v) verbose=1 - ;; - c) commit=$OPTARG - ;; - m) export MACHINE=$OPTARG - ;; - j) pmake=$OPTARG - ;; - t) export BB_NUMBER_THREADS=$OPTARG - ;; - i) IMAGE=$OPTARG - ;; - d) dldir=$OPTARG - ;; - p) cherrypicks="$cherrypicks $OPTARG" - ;; - *) usage - exit 1 - ;; - esac -done - - -#drop cached credentials and test for sudo access without a password -sudo -k -n ls > /dev/null 2>&1 -reqpass=$? 
-if [ $reqpass -ne 0 ]; then - echo "The script requires sudo access to drop caches between builds (echo 3 > /proc/sys/vm/drop_caches)" - read -s -p "Please enter your sudo password: " pass - echo -fi - -if [ -n "$commit" ]; then - echo "git checkout -f $commit" - git pull > /dev/null 2>&1 - git checkout -f $commit || exit 1 - git pull > /dev/null 2>&1 -fi - -if [ -n "$cherrypicks" ]; then - for c in $cherrypicks; do - git cherry-pick $c - done -fi - -rev=$(git rev-parse --short HEAD) || exit 1 -OUTDIR="$clonedir/build-perf-test/results-$rev-`date "+%Y%m%d%H%M%S"`" -BUILDDIR="$OUTDIR/build" -resultsfile="$OUTDIR/results.log" -cmdoutput="$OUTDIR/commands.log" -myoutput="$OUTDIR/output.log" -globalres="$clonedir/build-perf-test/globalres.log" - -mkdir -p $OUTDIR || exit 1 - -log () { - local msg="$1" - echo "`date`: $msg" | tee -a $myoutput -} - - -# -# Config stuff -# - -branch=`git branch 2>&1 | grep "^* " | tr -d "* "` -gitcommit=$(git rev-parse HEAD) || exit 1 -log "Running on $branch:$gitcommit" - -source ./oe-init-build-env $OUTDIR/build >/dev/null || exit 1 -cd $OUTDIR/build - -[ -n "$MACHINE" ] || export MACHINE="qemux86" -[ -n "$BB_NUMBER_THREADS" ] || export BB_NUMBER_THREADS="8" - -if [ -n "$pmake" ]; then - export PARALLEL_MAKE="-j $pmake" -else - export PARALLEL_MAKE="-j 8" -fi - -if [ -n "$dldir" ]; then - echo "DL_DIR = \"$dldir\"" >> conf/local.conf -else - echo "DL_DIR = \"$clonedir/build-perf-test/downloads\"" >> conf/local.conf -fi - -# Sometimes I've noticed big differences in timings for the same commit, on the same machine -# Disabling the network sanity check helps a bit (because of my crappy network connection and/or proxy) -echo "CONNECTIVITY_CHECK_URIS =\"\"" >> conf/local.conf - - -# -# Functions -# - -declare -a TIMES -time_count=0 -declare -a SIZES -size_count=0 - -time_cmd () { - log " Timing: $*" - - if [ $verbose -eq 0 ]; then - /usr/bin/time -v -o $resultsfile "$@" >> $cmdoutput - else - /usr/bin/time -v -o $resultsfile "$@" - fi - ret=$? - if [ $ret -eq 0 ]; then - t=`grep wall $resultsfile | sed 's/.*m:ss): //'` - log " TIME: $t" - TIMES[(( time_count++ ))]="$t" - else - log "ERROR: exit status was non-zero, will report time as 0." - TIMES[(( time_count++ ))]="0" - fi - - #time by default overwrites the output file and we want to keep the results - #it has an append option but I don't want to clobber the results in the same file - i=`ls $OUTDIR/results.log* |wc -l` - mv $resultsfile "${resultsfile}.${i}" - log "More stats can be found in ${resultsfile}.${i}" -} - -bbtime () { - time_cmd bitbake "$@" -} - -#we don't time bitbake here -bbnotime () { - local arg="$@" - log " Running: bitbake ${arg}" - if [ $verbose -eq 0 ]; then - bitbake ${arg} >> $cmdoutput - else - bitbake ${arg} - fi - ret=$? - if [ $ret -eq 0 ]; then - log " Finished bitbake ${arg}" - else - log "ERROR: exit status was non-zero. Exit.." 
- exit $ret - fi - -} - -do_rmtmp() { - log " Removing tmp" - rm -rf bitbake.lock pseudodone conf/sanity_info cache tmp -} -do_rmsstate () { - log " Removing sstate-cache" - rm -rf sstate-cache -} -do_sync () { - log " Syncing and dropping caches" - sync; sync - if [ $reqpass -eq 0 ]; then - sudo sh -c "echo 3 > /proc/sys/vm/drop_caches" - else - echo "$pass" | sudo -S sh -c "echo 3 > /proc/sys/vm/drop_caches" - echo - fi - sleep 3 -} - -write_results() { - echo -n "`uname -n`,$branch:$gitcommit,`git describe`," >> $globalres - for i in "${TIMES[@]}"; do - echo -n "$i," >> $globalres - done - for i in "${SIZES[@]}"; do - echo -n "$i," >> $globalres - done - echo >> $globalres - sed -i '$ s/,$//' $globalres -} - -#### - -# -# Test 1 -# Measure: Wall clock of "bitbake core-image-sato" and size of tmp/dir (w/o rm_work and w/ rm_work) -# Pre: Downloaded sources, no sstate -# Steps: -# Part1: -# - fetchall -# - clean build dir -# - time bitbake core-image-sato -# - collect data -# Part2: -# - bitbake virtual/kernel -c cleansstate -# - time bitbake virtual/kernel -# Part3: -# - add INHERIT to local.conf -# - clean build dir -# - build -# - report size, remove INHERIT - -test1_p1 () { - log "Running Test 1, part 1/3: Measure wall clock of bitbake $IMAGE and size of tmp/ dir" - bbnotime $IMAGE --runall=fetch - do_rmtmp - do_rmsstate - do_sync - bbtime $IMAGE - s=`du -s tmp | sed 's/tmp//' | sed 's/[ \t]*$//'` - SIZES[(( size_count++ ))]="$s" - log "SIZE of tmp dir is: $s" - log "Buildstats are saved in $OUTDIR/buildstats-test1" - mv tmp/buildstats $OUTDIR/buildstats-test1 -} - - -test1_p2 () { - log "Running Test 1, part 2/3: bitbake virtual/kernel -c cleansstate and time bitbake virtual/kernel" - bbnotime virtual/kernel -c cleansstate - do_sync - bbtime virtual/kernel -} - -test1_p3 () { - log "Running Test 1, part 3/3: Build $IMAGE w/o sstate and report size of tmp/dir with rm_work enabled" - echo "INHERIT += \"rm_work\"" >> conf/local.conf - do_rmtmp - do_rmsstate - do_sync - bbtime $IMAGE - sed -i 's/INHERIT += \"rm_work\"//' conf/local.conf - s=`du -s tmp | sed 's/tmp//' | sed 's/[ \t]*$//'` - SIZES[(( size_count++ ))]="$s" - log "SIZE of tmp dir is: $s" - log "Buildstats are saved in $OUTDIR/buildstats-test13" - mv tmp/buildstats $OUTDIR/buildstats-test13 -} - - -# -# Test 2 -# Measure: Wall clock of "bitbake core-image-sato" and size of tmp/dir -# Pre: populated sstate cache - -test2 () { - # Assuming test 1 has run - log "Running Test 2: Measure wall clock of bitbake $IMAGE -c rootfs with sstate" - do_rmtmp - do_sync - bbtime $IMAGE -c rootfs -} - - -# Test 3 -# parsing time metrics -# -# Start with -# i) "rm -rf tmp/cache; time bitbake -p" -# ii) "rm -rf tmp/cache/default-glibc/; time bitbake -p" -# iii) "time bitbake -p" - - -test3 () { - log "Running Test 3: Parsing time metrics (bitbake -p)" - log " Removing tmp/cache && cache" - rm -rf tmp/cache cache - bbtime -p - log " Removing tmp/cache/default-glibc/" - rm -rf tmp/cache/default-glibc/ - bbtime -p - bbtime -p -} - -# -# Test 4 - eSDK -# Measure: eSDK size and installation time -test4 () { - log "Running Test 4: eSDK size and installation time" - bbnotime $IMAGE -c do_populate_sdk_ext - - esdk_installer=(tmp/deploy/sdk/*-toolchain-ext-*.sh) - - if [ ${#esdk_installer[*]} -eq 1 ]; then - s=$((`stat -c %s "$esdk_installer"` / 1024)) - SIZES[(( size_count++ ))]="$s" - log "Download SIZE of eSDK is: $s kB" - - do_sync - time_cmd "$esdk_installer" -y -d "tmp/esdk-deploy" - - s=$((`du -sb "tmp/esdk-deploy" | cut -f1` / 1024)) - SIZES[(( 
size_count++ ))]="$s" - log "Install SIZE of eSDK is: $s kB" - else - log "ERROR: other than one sdk found (${esdk_installer[*]}), reporting size and time as 0." - SIZES[(( size_count++ ))]="0" - TIMES[(( time_count++ ))]="0" - fi - -} - - -# RUN! - -test1_p1 -test1_p2 -test1_p3 -test2 -test3 -test4 - -# if we got til here write to global results -write_results - -log "All done, cleaning up..." - -do_rmtmp -do_rmsstate diff --git a/poky/scripts/lib/buildstats.py b/poky/scripts/lib/buildstats.py index d9aadf3cb..f7db3eaf9 100644 --- a/poky/scripts/lib/buildstats.py +++ b/poky/scripts/lib/buildstats.py @@ -263,7 +263,7 @@ class BuildStats(dict): """Aggregate other buildstats into this""" if set(self.keys()) != set(buildstats.keys()): raise ValueError("Refusing to aggregate buildstats, set of " - "recipes is different") + "recipes is different: %s" % (set(self.keys()) ^ set(buildstats.keys()))) for pkg, data in buildstats.items(): self[pkg].aggregate(data) diff --git a/poky/scripts/lib/checklayer/__init__.py b/poky/scripts/lib/checklayer/__init__.py index 778804184..670f0eea3 100644 --- a/poky/scripts/lib/checklayer/__init__.py +++ b/poky/scripts/lib/checklayer/__init__.py @@ -196,38 +196,36 @@ def add_layer_dependencies(bblayersconf, layer, layers, logger): if layer_depends is None: return False else: - # Don't add a layer that is already present. - added = set() - output = check_command('Getting existing layers failed.', 'bitbake-layers show-layers').decode('utf-8') - for layer, path, pri in re.findall(r'^(\S+) +([^\n]*?) +(\d+)$', output, re.MULTILINE): - added.add(path) - - for layer_depend in layer_depends: - name = layer_depend['name'] - path = layer_depend['path'] + add_layers(bblayersconf, layer_depends, logger) + + return True + +def add_layers(bblayersconf, layers, logger): + # Don't add a layer that is already present. + added = set() + output = check_command('Getting existing layers failed.', 'bitbake-layers show-layers').decode('utf-8') + for layer, path, pri in re.findall(r'^(\S+) +([^\n]*?) +(\d+)$', output, re.MULTILINE): + added.add(path) + + with open(bblayersconf, 'a+') as f: + for layer in layers: + logger.info('Adding layer %s' % layer['name']) + name = layer['name'] + path = layer['path'] if path in added: - continue + logger.info('%s is already in %s' % (name, bblayersconf)) else: added.add(path) - logger.info('Adding layer dependency %s' % name) - with open(bblayersconf, 'a+') as f: f.write("\nBBLAYERS += \"%s\"\n" % path) return True -def add_layer(bblayersconf, layer, layers, logger): - logger.info('Adding layer %s' % layer['name']) - with open(bblayersconf, 'a+') as f: - f.write("\nBBLAYERS += \"%s\"\n" % layer['path']) - - return True - -def check_command(error_msg, cmd): +def check_command(error_msg, cmd, cwd=None): ''' Run a command under a shell, capture stdout and stderr in a single stream, throw an error when command returns non-zero exit code. Returns the output. ''' - p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) + p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=cwd) output, _ = p.communicate() if p.returncode: msg = "%s\nCommand: %s\nOutput:\n%s" % (error_msg, cmd, output.decode('utf-8')) @@ -257,7 +255,7 @@ def get_signatures(builddir, failsafe=False, machine=None): os.unlink(sigs_file) try: check_command('Generating signatures failed. 
This might be due to some parse error and/or general layer incompatibilities.', - cmd) + cmd, builddir) except RuntimeError as ex: if failsafe and os.path.exists(sigs_file): # Ignore the error here. Most likely some recipes active diff --git a/poky/scripts/lib/devtool/standard.py b/poky/scripts/lib/devtool/standard.py index d14b7a654..b7d4d47df 100644 --- a/poky/scripts/lib/devtool/standard.py +++ b/poky/scripts/lib/devtool/standard.py @@ -509,6 +509,11 @@ def _extract_source(srctree, keep_temp, devbranch, sync, config, basepath, works if not 'flag' in event: if event['op'].startswith(('_append[', '_prepend[')): extra_overrides.append(event['op'].split('[')[1].split(']')[0]) + # We want to remove duplicate overrides. If a recipe had multiple + # SRC_URI_override += values it would cause mulitple instances of + # overrides. This doesn't play nicely with things like creating a + # branch for every instance of DEVTOOL_EXTRA_OVERRIDES. + extra_overrides = list(set(extra_overrides)) if extra_overrides: logger.info('SRC_URI contains some conditional appends/prepends - will create branches to represent these') @@ -769,9 +774,13 @@ def modify(args, config, basepath, workspace): check_commits = True else: if os.path.exists(os.path.join(srctree, '.git')): - # Check if it's a tree previously extracted by us + # Check if it's a tree previously extracted by us. This is done + # by ensuring that devtool-base and args.branch (devtool) exist. + # The check_commits logic will cause an exception if either one + # of these doesn't exist try: (stdout, _) = bb.process.run('git branch --contains devtool-base', cwd=srctree) + bb.process.run('git rev-parse %s' % args.branch, cwd=srctree) except bb.process.ExecutionError: stdout = '' if stdout: diff --git a/poky/scripts/lib/wic/engine.py b/poky/scripts/lib/wic/engine.py index 4662c665c..ea600d285 100644 --- a/poky/scripts/lib/wic/engine.py +++ b/poky/scripts/lib/wic/engine.py @@ -245,9 +245,16 @@ class Disk: self._ptable_format = None # find parted - self.paths = "/bin:/usr/bin:/usr/sbin:/sbin/" + # read paths from $PATH environment variable + # if it fails, use hardcoded paths + pathlist = "/bin:/usr/bin:/usr/sbin:/sbin/" + try: + self.paths = os.environ['PATH'] + ":" + pathlist + except KeyError: + self.paths = pathlist + if native_sysroot: - for path in self.paths.split(':'): + for path in pathlist.split(':'): self.paths = "%s%s:%s" % (native_sysroot, path, self.paths) self.parted = find_executable("parted", self.paths) @@ -331,7 +338,7 @@ class Disk: def copy(self, src, pnum, path): """Copy partition image into wic image.""" if self.partitions[pnum].fstype.startswith('ext'): - cmd = "echo -e 'cd {}\nwrite {} {}' | {} -w {}".\ + cmd = "printf 'cd {}\nwrite {} {}' | {} -w {}".\ format(path, src, os.path.basename(src), self.debugfs, self._get_part_image(pnum)) else: # fat diff --git a/poky/scripts/multilib_header_wrapper.h b/poky/scripts/multilib_header_wrapper.h index 9660225fd..482479078 100644 --- a/poky/scripts/multilib_header_wrapper.h +++ b/poky/scripts/multilib_header_wrapper.h @@ -21,6 +21,7 @@ * */ +#pragma once #if defined (__bpf__) #define __MHWORDSIZE 64 diff --git a/poky/scripts/oe-build-perf-report b/poky/scripts/oe-build-perf-report index 0bd05f44e..f6fb458c2 100755 --- a/poky/scripts/oe-build-perf-report +++ b/poky/scripts/oe-build-perf-report @@ -37,58 +37,18 @@ from buildstats import BuildStats, diff_buildstats, BSVerDiff scriptpath.add_oe_lib_path() from oeqa.utils.git import GitRepo, GitError +import oeqa.utils.gitarchive as gitarchive # Setup 
logging logging.basicConfig(level=logging.INFO, format="%(levelname)s: %(message)s") log = logging.getLogger('oe-build-perf-report') - -# Container class for tester revisions -TestedRev = namedtuple('TestedRev', 'commit commit_number tags') - - -def get_test_runs(repo, tag_name, **kwargs): - """Get a sorted list of test runs, matching given pattern""" - # First, get field names from the tag name pattern - field_names = [m.group(1) for m in re.finditer(r'{(\w+)}', tag_name)] - undef_fields = [f for f in field_names if f not in kwargs.keys()] - - # Fields for formatting tag name pattern - str_fields = dict([(f, '*') for f in field_names]) - str_fields.update(kwargs) - - # Get a list of all matching tags - tag_pattern = tag_name.format(**str_fields) - tags = repo.run_cmd(['tag', '-l', tag_pattern]).splitlines() - log.debug("Found %d tags matching pattern '%s'", len(tags), tag_pattern) - - # Parse undefined fields from tag names - str_fields = dict([(f, r'(?P<{}>[\w\-.()]+)'.format(f)) for f in field_names]) - str_fields['branch'] = r'(?P<branch>[\w\-.()/]+)' - str_fields['commit'] = '(?P<commit>[0-9a-f]{7,40})' - str_fields['commit_number'] = '(?P<commit_number>[0-9]{1,7})' - str_fields['tag_number'] = '(?P<tag_number>[0-9]{1,5})' - # escape parenthesis in fields in order to not messa up the regexp - fixed_fields = dict([(k, v.replace('(', r'\(').replace(')', r'\)')) for k, v in kwargs.items()]) - str_fields.update(fixed_fields) - tag_re = re.compile(tag_name.format(**str_fields)) - - # Parse fields from tags - revs = [] - for tag in tags: - m = tag_re.match(tag) - groups = m.groupdict() - revs.append([groups[f] for f in undef_fields] + [tag]) - - # Return field names and a sorted list of revs - return undef_fields, sorted(revs) - def list_test_revs(repo, tag_name, verbosity, **kwargs): """Get list of all tested revisions""" valid_kwargs = dict([(k, v) for k, v in kwargs.items() if v is not None]) - fields, revs = get_test_runs(repo, tag_name, **valid_kwargs) + fields, revs = gitarchive.get_test_runs(log, repo, tag_name, **valid_kwargs) ignore_fields = ['tag_number'] if verbosity < 2: extra_fields = ['COMMITS', 'TEST RUNS'] @@ -133,36 +93,6 @@ def list_test_revs(repo, tag_name, verbosity, **kwargs): print_table(rows) -def get_test_revs(repo, tag_name, **kwargs): - """Get list of all tested revisions""" - fields, runs = get_test_runs(repo, tag_name, **kwargs) - - revs = {} - commit_i = fields.index('commit') - commit_num_i = fields.index('commit_number') - for run in runs: - commit = run[commit_i] - commit_num = run[commit_num_i] - tag = run[-1] - if not commit in revs: - revs[commit] = TestedRev(commit, commit_num, [tag]) - else: - assert commit_num == revs[commit].commit_number, "Commit numbers do not match" - revs[commit].tags.append(tag) - - # Return in sorted table - revs = sorted(revs.values(), key=attrgetter('commit_number')) - log.debug("Found %d tested revisions:\n %s", len(revs), - "\n ".join(['{} ({})'.format(rev.commit_number, rev.commit) for rev in revs])) - return revs - -def rev_find(revs, attr, val): - """Search from a list of TestedRev""" - for i, rev in enumerate(revs): - if getattr(rev, attr) == val: - return i - raise ValueError("Unable to find '{}' value '{}'".format(attr, val)) - def is_xml_format(repo, commit): """Check if the commit contains xml (or json) data""" if repo.rev_parse(commit + ':results.xml'): @@ -427,8 +357,8 @@ def print_html_report(data, id_comp, buildstats): # Compare buildstats bs_key = test + '.' 
+ meas - rev = metadata['commit_num']['value'] - comp_rev = metadata['commit_num']['value_old'] + rev = str(metadata['commit_num']['value']) + comp_rev = str(metadata['commit_num']['value_old']) if (rev in buildstats and bs_key in buildstats[rev] and comp_rev in buildstats and bs_key in buildstats[comp_rev]): new_meas['buildstats'] = BSSummary(buildstats[comp_rev][bs_key], @@ -512,10 +442,10 @@ def auto_args(repo, args): key = split[0] val = split[1].strip() - if key == 'hostname': + if key == 'hostname' and not args.hostname: log.debug("Using hostname %s", val) args.hostname = val - elif key == 'branch': + elif key == 'branch' and not args.branch: log.debug("Using branch %s", val) args.branch = val @@ -541,7 +471,8 @@ Examine build performance test results from a Git repository""" default='{hostname}/{branch}/{machine}/{commit_number}-g{commit}/{tag_number}', help="Tag name (pattern) for finding results") group.add_argument('--hostname', '-H') - group.add_argument('--branch', '-B', default='master') + group.add_argument('--branch', '-B', default='master', help="Branch to find commit in") + group.add_argument('--branch2', help="Branch to find comparision revisions in") group.add_argument('--machine', default='qemux86') group.add_argument('--history-length', default=25, type=int, help="Number of tested revisions to plot in html report") @@ -577,32 +508,51 @@ def main(argv=None): if not args.hostname: auto_args(repo, args) - revs = get_test_revs(repo, args.tag_name, hostname=args.hostname, - branch=args.branch, machine=args.machine) - if len(revs) < 2: - log.error("%d tester revisions found, unable to generate report", - len(revs)) - return 1 + revs = gitarchive.get_test_revs(log, repo, args.tag_name, hostname=args.hostname, + branch=args.branch, machine=args.machine) + if args.branch2: + revs2 = gitarchive.get_test_revs(log, repo, args.tag_name, hostname=args.hostname, + branch=args.branch2, machine=args.machine) + if not len(revs2): + log.error("No revisions found to compare against") + return 1 + if not len(revs): + log.error("No revision to report on found") + return 1 + else: + if len(revs) < 2: + log.error("Only %d tester revisions found, unable to generate report" % len(revs)) + return 1 # Pick revisions if args.commit: if args.commit_number: log.warning("Ignoring --commit-number as --commit was specified") - index1 = rev_find(revs, 'commit', args.commit) + index1 = gitarchive.rev_find(revs, 'commit', args.commit) elif args.commit_number: - index1 = rev_find(revs, 'commit_number', args.commit_number) + index1 = gitarchive.rev_find(revs, 'commit_number', args.commit_number) else: index1 = len(revs) - 1 + if args.branch2: + revs2.append(revs[index1]) + index1 = len(revs2) - 1 + revs = revs2 + if args.commit2: if args.commit_number2: log.warning("Ignoring --commit-number2 as --commit2 was specified") - index2 = rev_find(revs, 'commit', args.commit2) + index2 = gitarchive.rev_find(revs, 'commit', args.commit2) elif args.commit_number2: - index2 = rev_find(revs, 'commit_number', args.commit_number2) + index2 = gitarchive.rev_find(revs, 'commit_number', args.commit_number2) else: if index1 > 0: index2 = index1 - 1 + # Find the closest matching commit number for comparision + # In future we could check the commit is a common ancestor and + # continue back if not but this good enough for now + while index2 > 0 and revs[index2].commit_number > revs[index1].commit_number: + index2 = index2 - 1 else: log.error("Unable to determine the other commit, use " "--commit2 or --commit-number2 to specify 
it") diff --git a/poky/scripts/oe-git-archive b/poky/scripts/oe-git-archive index 913291a99..ab1c2b9ad 100755 --- a/poky/scripts/oe-git-archive +++ b/poky/scripts/oe-git-archive @@ -14,16 +14,10 @@ # more details. # import argparse -import glob -import json import logging -import math import os import re import sys -from collections import namedtuple, OrderedDict -from datetime import datetime, timedelta, tzinfo -from operator import attrgetter # Import oe and bitbake libs scripts_path = os.path.dirname(os.path.realpath(__file__)) @@ -34,128 +28,13 @@ scriptpath.add_oe_lib_path() from oeqa.utils.git import GitRepo, GitError from oeqa.utils.metadata import metadata_from_bb - +import oeqa.utils.gitarchive as gitarchive # Setup logging logging.basicConfig(level=logging.INFO, format="%(levelname)s: %(message)s") log = logging.getLogger() -class ArchiveError(Exception): - """Internal error handling of this script""" - - -def format_str(string, fields): - """Format string using the given fields (dict)""" - try: - return string.format(**fields) - except KeyError as err: - raise ArchiveError("Unable to expand string '{}': unknown field {} " - "(valid fields are: {})".format( - string, err, ', '.join(sorted(fields.keys())))) - - -def init_git_repo(path, no_create, bare): - """Initialize local Git repository""" - path = os.path.abspath(path) - if os.path.isfile(path): - raise ArchiveError("Invalid Git repo at {}: path exists but is not a " - "directory".format(path)) - if not os.path.isdir(path) or not os.listdir(path): - if no_create: - raise ArchiveError("No git repo at {}, refusing to create " - "one".format(path)) - if not os.path.isdir(path): - try: - os.mkdir(path) - except (FileNotFoundError, PermissionError) as err: - raise ArchiveError("Failed to mkdir {}: {}".format(path, err)) - if not os.listdir(path): - log.info("Initializing a new Git repo at %s", path) - repo = GitRepo.init(path, bare) - try: - repo = GitRepo(path, is_topdir=True) - except GitError: - raise ArchiveError("Non-empty directory that is not a Git repository " - "at {}\nPlease specify an existing Git repository, " - "an empty directory or a non-existing directory " - "path.".format(path)) - return repo - - -def git_commit_data(repo, data_dir, branch, message, exclude, notes): - """Commit data into a Git repository""" - log.info("Committing data into to branch %s", branch) - tmp_index = os.path.join(repo.git_dir, 'index.oe-git-archive') - try: - # Create new tree object from the data - env_update = {'GIT_INDEX_FILE': tmp_index, - 'GIT_WORK_TREE': os.path.abspath(data_dir)} - repo.run_cmd('add .', env_update) - - # Remove files that are excluded - if exclude: - repo.run_cmd(['rm', '--cached'] + [f for f in exclude], env_update) - - tree = repo.run_cmd('write-tree', env_update) - - # Create new commit object from the tree - parent = repo.rev_parse(branch) - git_cmd = ['commit-tree', tree, '-m', message] - if parent: - git_cmd += ['-p', parent] - commit = repo.run_cmd(git_cmd, env_update) - - # Create git notes - for ref, filename in notes: - ref = ref.format(branch_name=branch) - repo.run_cmd(['notes', '--ref', ref, 'add', - '-F', os.path.abspath(filename), commit]) - - # Update branch head - git_cmd = ['update-ref', 'refs/heads/' + branch, commit] - if parent: - git_cmd.append(parent) - repo.run_cmd(git_cmd) - - # Update current HEAD, if we're on branch 'branch' - if not repo.bare and repo.get_current_branch() == branch: - log.info("Updating %s HEAD to latest commit", repo.top_dir) - repo.run_cmd('reset --hard') - - return 
commit - finally: - if os.path.exists(tmp_index): - os.unlink(tmp_index) - - -def expand_tag_strings(repo, name_pattern, msg_subj_pattern, msg_body_pattern, - keywords): - """Generate tag name and message, with support for running id number""" - keyws = keywords.copy() - # Tag number is handled specially: if not defined, we autoincrement it - if 'tag_number' not in keyws: - # Fill in all other fields than 'tag_number' - keyws['tag_number'] = '{tag_number}' - tag_re = format_str(name_pattern, keyws) - # Replace parentheses for proper regex matching - tag_re = tag_re.replace('(', '\(').replace(')', '\)') + '$' - # Inject regex group pattern for 'tag_number' - tag_re = tag_re.format(tag_number='(?P<tag_number>[0-9]{1,5})') - - keyws['tag_number'] = 0 - for existing_tag in repo.run_cmd('tag').splitlines(): - match = re.match(tag_re, existing_tag) - - if match and int(match.group('tag_number')) >= keyws['tag_number']: - keyws['tag_number'] = int(match.group('tag_number')) + 1 - - tag_name = format_str(name_pattern, keyws) - msg_subj= format_str(msg_subj_pattern.strip(), keyws) - msg_body = format_str(msg_body_pattern, keyws) - return tag_name, msg_subj + '\n\n' + msg_body - - def parse_args(argv): """Parse command line arguments""" parser = argparse.ArgumentParser( @@ -217,17 +96,11 @@ def get_nested(d, list_of_keys): return "" def main(argv=None): - """Script entry point""" args = parse_args(argv) if args.debug: log.setLevel(logging.DEBUG) try: - if not os.path.isdir(args.data_dir): - raise ArchiveError("Not a directory: {}".format(args.data_dir)) - - data_repo = init_git_repo(args.git_dir, args.no_create, args.bare) - # Get keywords to be used in tag and branch names and messages metadata = metadata_from_bb() keywords = {'hostname': get_nested(metadata, ['hostname']), @@ -236,39 +109,12 @@ def main(argv=None): 'commit_count': get_nested(metadata, ['layers', 'meta', 'commit_count']), 'machine': get_nested(metadata, ['config', 'MACHINE'])} - # Expand strings early in order to avoid getting into inconsistent - # state (e.g. 
no tag even if data was committed) - commit_msg = format_str(args.commit_msg_subject.strip(), keywords) - commit_msg += '\n\n' + format_str(args.commit_msg_body, keywords) - branch_name = format_str(args.branch_name, keywords) - tag_name = None - if not args.no_tag and args.tag_name: - tag_name, tag_msg = expand_tag_strings(data_repo, args.tag_name, - args.tag_msg_subject, - args.tag_msg_body, keywords) - - # Commit data - commit = git_commit_data(data_repo, args.data_dir, branch_name, - commit_msg, args.exclude, args.notes) - - # Create tag - if tag_name: - log.info("Creating tag %s", tag_name) - data_repo.run_cmd(['tag', '-a', '-m', tag_msg, tag_name, commit]) - - # Push data to remote - if args.push: - cmd = ['push', '--tags'] - # If no remote is given we push with the default settings from - # gitconfig - if args.push is not True: - notes_refs = ['refs/notes/' + ref.format(branch_name=branch_name) - for ref, _ in args.notes] - cmd.extend([args.push, branch_name] + notes_refs) - log.info("Pushing data to remote") - data_repo.run_cmd(cmd) + gitarchive.gitarchive(args.data_dir, args.git_dir, args.no_create, args.bare, + args.commit_msg_subject.strip(), args.commit_msg_body, args.branch_name, + args.no_tag, args.tag_name, args.tag_msg_subject, args.tag_msg_body, + args.exclude, args.notes, args.push, keywords, log) - except ArchiveError as err: + except gitarchive.ArchiveError as err: log.error(str(err)) return 1 diff --git a/poky/scripts/oe-git-proxy b/poky/scripts/oe-git-proxy index 7a43fe6a6..1800942f3 100755 --- a/poky/scripts/oe-git-proxy +++ b/poky/scripts/oe-git-proxy @@ -131,8 +131,8 @@ if [ -z "$ALL_PROXY" ]; then fi # Connect directly to hosts in NO_PROXY -for H in ${NO_PROXY//,/ }; do - if match_host $1 $H; then +for H in "${NO_PROXY//,/ }"; do + if match_host $1 "$H"; then exec $SOCAT STDIO $METHOD fi done diff --git a/poky/scripts/runqemu b/poky/scripts/runqemu index 55cdd414e..1c96b29a4 100755 --- a/poky/scripts/runqemu +++ b/poky/scripts/runqemu @@ -188,6 +188,7 @@ class BaseConfig(object): self.qemu_opt = '' self.qemu_opt_script = '' + self.qemuparams = '' self.clean_nfs_dir = False self.nfs_server = '' self.rootfs = '' @@ -455,7 +456,7 @@ class BaseConfig(object): elif arg.startswith('biosfilename='): self.qemu_opt_script += ' -bios %s' % arg[len('biosfilename='):] elif arg.startswith('qemuparams='): - self.qemu_opt_script += ' %s' % arg[len('qemuparams='):] + self.qemuparams = ' %s' % arg[len('qemuparams='):] elif arg.startswith('bootparams='): self.bootparams = arg[len('bootparams='):] elif os.path.exists(arg) or (re.search(':', arg) and re.search('/', arg)): @@ -662,13 +663,28 @@ class BaseConfig(object): raise RunQemuError("Invalid custombiosdir: %s" % self.custombiosdir) def check_mem(self): - s = re.search('-m +([0-9]+)', self.qemu_opt_script) + """ + Both qemu and kernel needs memory settings, so check QB_MEM and set it + for both. 
+ """ + s = re.search('-m +([0-9]+)', self.qemuparams) if s: self.set('QB_MEM', '-m %s' % s.group(1)) elif not self.get('QB_MEM'): logger.info('QB_MEM is not set, use 512M by default') self.set('QB_MEM', '-m 512') + # Check and remove M or m suffix + qb_mem = self.get('QB_MEM') + if qb_mem.endswith('M') or qb_mem.endswith('m'): + qb_mem = qb_mem[:-1] + + # Add -m prefix it not present + if not qb_mem.startswith('-m'): + qb_mem = '-m %s' % qb_mem + + self.set('QB_MEM', qb_mem) + mach = self.get('MACHINE') if not mach.startswith('qemumips'): self.kernel_cmdline_script += ' mem=%s' % self.get('QB_MEM').replace('-m','').strip() + 'M' @@ -1164,6 +1180,10 @@ class BaseConfig(object): self.qemu_opt += ' ' + self.qemu_opt_script + # Append qemuparams to override previous settings + if self.qemuparams: + self.qemu_opt += ' ' + self.qemuparams + if self.snapshot: self.qemu_opt += " -snapshot" diff --git a/poky/scripts/send-error-report b/poky/scripts/send-error-report index 8939f5f59..3528cf93a 100755 --- a/poky/scripts/send-error-report +++ b/poky/scripts/send-error-report @@ -62,7 +62,7 @@ def edit_content(json_file_path): def prepare_data(args): # attempt to get the max_log_size from the server's settings - max_log_size = getPayloadLimit("http://"+args.server+"/ClientPost/JSON") + max_log_size = getPayloadLimit("https://"+args.server+"/ClientPost/JSON") if not os.path.isfile(args.error_file): log.error("No data file found.") @@ -132,9 +132,9 @@ def send_data(data, args): headers={'Content-type': 'application/json', 'User-Agent': "send-error-report/"+version} if args.json: - url = "http://"+args.server+"/ClientPost/JSON/" + url = "https://"+args.server+"/ClientPost/JSON/" else: - url = "http://"+args.server+"/ClientPost/" + url = "https://"+args.server+"/ClientPost/" req = urllib.request.Request(url, data=data, headers=headers) try: diff --git a/poky/scripts/yocto-check-layer b/poky/scripts/yocto-check-layer index 9b7e53679..106c95525 100755 --- a/poky/scripts/yocto-check-layer +++ b/poky/scripts/yocto-check-layer @@ -22,7 +22,7 @@ import scriptpath scriptpath.add_oe_lib_path() scriptpath.add_bitbake_lib_path() -from checklayer import LayerType, detect_layers, add_layer, add_layer_dependencies, get_signatures +from checklayer import LayerType, detect_layers, add_layers, add_layer_dependencies, get_signatures from oeqa.utils.commands import get_bb_vars PROGNAME = 'yocto-check-layer' @@ -157,7 +157,7 @@ def main(): layers_tested = layers_tested + 1 continue - if any(map(lambda additional_layer: not add_layer(bblayersconf, additional_layer, dep_layers, logger), + if any(map(lambda additional_layer: not add_layers(bblayersconf, [additional_layer], logger), additional_layers)): logger.info('Skipping %s due to missing additional layers.' % layer['name']) results[layer['name']] = None @@ -179,7 +179,7 @@ def main(): continue td['machines'] = args.machines - if not add_layer(bblayersconf, layer, dep_layers, logger): + if not add_layers(bblayersconf, [layer], logger): logger.info('Skipping %s ???.' 
% layer['name']) results[layer['name']] = None results_status[layer['name']] = 'SKIPPED (Unknown)' diff --git a/poky/scripts/yocto-check-layer-wrapper b/poky/scripts/yocto-check-layer-wrapper index bbf6ee176..b5df9ce98 100755 --- a/poky/scripts/yocto-check-layer-wrapper +++ b/poky/scripts/yocto-check-layer-wrapper @@ -30,7 +30,9 @@ cd $base_dir build_dir=$(mktemp -p $base_dir -d -t build-XXXX) -source oe-init-build-env $build_dir +this_dir=$(dirname $(readlink -f $0)) + +source $this_dir/../oe-init-build-env $build_dir if [[ $output_log != '' ]]; then yocto-check-layer -o "$output_log" "$*" else |
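Among the script changes above, the build-perf-test-wrapper.sh update adds the -d (download directory) and -g (globalres directory) options so that DL_DIR and the globalres.log location are no longer tied to the work directory. A minimal usage sketch, assuming the script is run from the top of a poky checkout that already contains this patch (the shared directories shown are hypothetical):

```bash
# Hypothetical shared directories for the results archive, downloads and the
# globalres.log summary. The -c argument selects a specific commit of a
# branch using the <branch>:<commit> form from the script's usage text.
./scripts/contrib/build-perf-test-wrapper.sh \
    -a /srv/build-perf/archive \
    -d /srv/build-perf/downloads \
    -g /srv/build-perf/results \
    -c "thud:7c76c5d78b"
```

Note that with the new checks introduced in this patch, passing -E to request an email report now fails early unless phantomjs and optipng are installed.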