Message ID | 20190925122349.14872-1-ross.burton@intel.com |
---|---|
State | New |
Series | [thud] cve-check: backport rewrite from master |
Hi Ross/Richard, I'd like this applied to Sumo also. Should I create a new patch and send it to the list, or is there a process for requesting this is cherry-picked across? Thanks, Ryan. On Wed, 25 Sep 2019 at 13:24, Ross Burton <ross.burton@intel.com> wrote: > As detailed at [1] the XML feeds provided by NIST are being discontinued on > October 9th 2019. As cve-check-tool uses these feeds, cve-check.bbclass > will be > inoperable after this date. > > To ensure that cve-check continues working, backport the following commits > from > master to move away from the unmaintained cve-check-tool to our own Python > code > that fetches the JSON: > > 546d14135c5 cve-update-db: New recipe to update CVE database > bc144b028f6 cve-check: Remove dependency to cve-check-tool-native > 7f62a20b32a cve-check: Manage CVE_PRODUCT with more than one name > 3bf63bc6084 cve-check: Consider CVE that affects versions with less than > operator > c0eabd30d7b cve-update-db: Use std library instead of urllib3 > 27eb839ee65 cve-check: be idiomatic > 09be21f4d17 cve-update-db: Manage proxy if needed. > 975793e3825 cve-update-db: do_populate_cve_db depends on do_fetch > 0325dd72714 cve-update-db: Catch request.urlopen errors. > 4078da92b49 cve-check: Depends on cve-update-db-native > f7676e9a38d cve-update-db: Use NVD CPE data to populate PRODUCTS table > bc0195be1b1 cve-check: Update unpatched CVE matching > c807c2a6409 cve-update-db-native: Skip recipe when cve-check class is not > loaded. > 07bb8b25e17 cve-check: remove redundant readline CVE whitelisting > 5388ed6d137 cve-check-tool: remove > 270ac00cb43 cve-check.bbclass: initialize to_append > e6bf9000987 cve-check: allow comparison of Vendor as well as Product > 91770338f76 cve-update-db-native: use SQL placeholders instead of format > strings > 7069302a4cc cve-check: Replace CVE_CHECK_CVE_WHITELIST by > CVE_CHECK_WHITELIST > 78de2cb39d7 cve-update-db-native: Remove hash column from database. 
> 4b301030cf9 cve-update-db-native: use os.path.join instead of + > f0d822fad2a cve-update-db: actually inherit native > b309840b6aa cve-update-db-native: use executemany() to optimise CPE > insertion > bb4e53af33d cve-update-db-native: improve metadata parsing > 94227459792 cve-update-db-native: clean up JSON fetching > 95438d52b73 cve-update-db-native: fix https proxy issues > 1f9a963b9ff glibc: exclude child recipes from CVE scanning > > [1] https://nvd.nist.gov/General/News/XML-Vulnerability-Feed-Retirement > > Signed-off-by: Ross Burton <ross.burton@intel.com> > --- > meta/classes/cve-check.bbclass | 142 +++++++----- > meta/conf/distro/include/maintainers.inc | 1 + > meta/recipes-core/glibc/glibc-locale.inc | 3 + > meta/recipes-core/glibc/glibc-mtrace.inc | 3 + > meta/recipes-core/glibc/glibc-scripts.inc | 3 + > .../recipes-core/meta/cve-update-db-native.bb | 195 ++++++++++++++++ > .../cve-check-tool/cve-check-tool_5.6.4.bb | 62 ----- > ...x-freeing-memory-allocated-by-sqlite.patch | 50 ---- > ...erriding-default-CA-certificate-file.patch | 215 ------------------ > ...s-in-percent-when-downloading-CVE-db.patch | 135 ----------- > ...omputed-vs-expected-sha256-digit-str.patch | 52 ----- > ...heck-for-malloc_trim-before-using-it.patch | 51 ----- > 12 files changed, 292 insertions(+), 620 deletions(-) > create mode 100644 meta/recipes-core/meta/cve-update-db-native.bb > delete mode 100644 meta/recipes-devtools/cve-check-tool/ > cve-check-tool_5.6.4.bb > delete mode 100644 > meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch > delete mode 100644 > meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch > delete mode 100644 > meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch > delete mode 100644 > meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch > delete mode 100644 > meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch > > diff --git a/meta/classes/cve-check.bbclass > b/meta/classes/cve-check.bbclass > index 743bc08a4f9..c00d2910be1 100644 > --- a/meta/classes/cve-check.bbclass > +++ b/meta/classes/cve-check.bbclass > @@ -26,7 +26,7 @@ CVE_PRODUCT ??= "${BPN}" > CVE_VERSION ??= "${PV}" > > CVE_CHECK_DB_DIR ?= "${DL_DIR}/CVE_CHECK" > -CVE_CHECK_DB_FILE ?= "${CVE_CHECK_DB_DIR}/nvd.db" > +CVE_CHECK_DB_FILE ?= "${CVE_CHECK_DB_DIR}/nvdcve_1.0.db" > > CVE_CHECK_LOG ?= "${T}/cve.log" > CVE_CHECK_TMP_FILE ?= "${TMPDIR}/cve_check" > @@ -37,32 +37,33 @@ CVE_CHECK_COPY_FILES ??= "1" > CVE_CHECK_CREATE_MANIFEST ??= "1" > > # Whitelist for packages (PN) > -CVE_CHECK_PN_WHITELIST = "\ > - glibc-locale \ > -" > +CVE_CHECK_PN_WHITELIST ?= "" > > -# Whitelist for CVE and version of package > -CVE_CHECK_CVE_WHITELIST = "{\ > - 'CVE-2014-2524': ('6.3','5.2',), \ > -}" > +# Whitelist for CVE. If a CVE is found, then it is considered patched. 
> +# The value is a string containing space separated CVE values: > +# > +# CVE_CHECK_WHITELIST = 'CVE-2014-2524 CVE-2018-1234' > +# > +CVE_CHECK_WHITELIST ?= "" > > python do_cve_check () { > """ > Check recipe for patched and unpatched CVEs > """ > > - if os.path.exists(d.getVar("CVE_CHECK_TMP_FILE")): > + if os.path.exists(d.getVar("CVE_CHECK_DB_FILE")): > patched_cves = get_patches_cves(d) > patched, unpatched = check_cves(d, patched_cves) > if patched or unpatched: > cve_data = get_cve_info(d, patched + unpatched) > cve_write_data(d, patched, unpatched, cve_data) > else: > - bb.note("Failed to update CVE database, skipping CVE check") > + bb.note("No CVE database found, skipping CVE check") > + > } > > addtask cve_check after do_unpack before do_build > -do_cve_check[depends] = "cve-check-tool-native:do_populate_sysroot > cve-check-tool-native:do_populate_cve_db" > +do_cve_check[depends] = "cve-update-db-native:do_populate_cve_db" > do_cve_check[nostamp] = "1" > > python cve_check_cleanup () { > @@ -163,65 +164,94 @@ def get_patches_cves(d): > > def check_cves(d, patched_cves): > """ > - Run cve-check-tool looking for patched and unpatched CVEs. > + Connect to the NVD database and find unpatched cves. > """ > - > import ast, csv, tempfile, subprocess, io > + from distutils.version import LooseVersion > > - cves_patched = [] > cves_unpatched = [] > - bpn = d.getVar("CVE_PRODUCT") > + # CVE_PRODUCT can contain more than one product (eg. curl/libcurl) > + products = d.getVar("CVE_PRODUCT").split() > # If this has been unset then we're not scanning for CVEs here (for > example, image recipes) > - if not bpn: > + if not products: > return ([], []) > pv = d.getVar("CVE_VERSION").split("+git")[0] > - cves = " ".join(patched_cves) > - cve_db_dir = d.getVar("CVE_CHECK_DB_DIR") > - cve_whitelist = ast.literal_eval(d.getVar("CVE_CHECK_CVE_WHITELIST")) > - cve_cmd = "cve-check-tool" > - cmd = [cve_cmd, "--no-html", "--skip-update", "--csv", > "--not-affected", "-t", "faux", "-d", cve_db_dir] > > # If the recipe has been whitlisted we return empty lists > if d.getVar("PN") in d.getVar("CVE_CHECK_PN_WHITELIST").split(): > bb.note("Recipe has been whitelisted, skipping check") > return ([], []) > > - try: > - # Write the faux CSV file to be used with cve-check-tool > - fd, faux = tempfile.mkstemp(prefix="cve-faux-") > - with os.fdopen(fd, "w") as f: > - for pn in bpn.split(): > - f.write("%s,%s,%s,\n" % (pn, pv, cves)) > - cmd.append(faux) > - > - output = subprocess.check_output(cmd).decode("utf-8") > - bb.debug(2, "Output of command %s:\n%s" % ("\n".join(cmd), > output)) > - except subprocess.CalledProcessError as e: > - bb.warn("Couldn't check for CVEs: %s (output %s)" % (e, e.output)) > - finally: > - os.remove(faux) > - > - for row in csv.reader(io.StringIO(output)): > - # Third row has the unpatched CVEs > - if row[2]: > - for cve in row[2].split(): > - # Skip if the CVE has been whitlisted for the current > version > - if pv in cve_whitelist.get(cve,[]): > - bb.note("%s-%s has been whitelisted for %s" % (bpn, > pv, cve)) > + old_cve_whitelist = d.getVar("CVE_CHECK_CVE_WHITELIST") > + if old_cve_whitelist: > + bb.warn("CVE_CHECK_CVE_WHITELIST is deprecated, please use > CVE_CHECK_WHITELIST.") > + cve_whitelist = d.getVar("CVE_CHECK_WHITELIST").split() > + > + import sqlite3 > + db_file = d.getVar("CVE_CHECK_DB_FILE") > + conn = sqlite3.connect(db_file) > + > + for product in products: > + c = conn.cursor() > + if ":" in product: > + vendor, product = product.split(":", 1) > + c.execute("SELECT 
* FROM PRODUCTS WHERE PRODUCT IS ? AND > VENDOR IS ?", (product, vendor)) > + else: > + c.execute("SELECT * FROM PRODUCTS WHERE PRODUCT IS ?", > (product,)) > + > + for row in c: > + cve = row[0] > + version_start = row[3] > + operator_start = row[4] > + version_end = row[5] > + operator_end = row[6] > + > + if cve in cve_whitelist: > + bb.note("%s-%s has been whitelisted for %s" % (product, > pv, cve)) > + elif cve in patched_cves: > + bb.note("%s has been patched" % (cve)) > + else: > + to_append = False > + if (operator_start == '=' and pv == version_start): > + cves_unpatched.append(cve) > else: > + if operator_start: > + try: > + to_append_start = (operator_start == '>=' > and LooseVersion(pv) >= LooseVersion(version_start)) > + to_append_start |= (operator_start == '>' and > LooseVersion(pv) > LooseVersion(version_start)) > + except: > + bb.note("%s: Failed to compare %s %s %s for > %s" % > + (product, pv, operator_start, > version_start, cve)) > + to_append_start = False > + else: > + to_append_start = False > + > + if operator_end: > + try: > + to_append_end = (operator_end == '<=' and > LooseVersion(pv) <= LooseVersion(version_end)) > + to_append_end |= (operator_end == '<' and > LooseVersion(pv) < LooseVersion(version_end)) > + except: > + bb.note("%s: Failed to compare %s %s %s for > %s" % > + (product, pv, operator_end, > version_end, cve)) > + to_append_end = False > + else: > + to_append_end = False > + > + if operator_start and operator_end: > + to_append = to_append_start and to_append_end > + else: > + to_append = to_append_start or to_append_end > + > + if to_append: > cves_unpatched.append(cve) > - bb.debug(2, "%s-%s is not patched for %s" % (bpn, pv, > cve)) > - # Fourth row has patched CVEs > - if row[3]: > - for cve in row[3].split(): > - cves_patched.append(cve) > - bb.debug(2, "%s-%s is patched for %s" % (bpn, pv, cve)) > + bb.debug(2, "%s-%s is not patched for %s" % (product, pv, > cve)) > + conn.close() > > - return (cves_patched, cves_unpatched) > + return (list(patched_cves), cves_unpatched) > > def get_cve_info(d, cves): > """ > - Get CVE information from the database used by cve-check-tool. > + Get CVE information from the database. > > Unfortunately the only way to get CVE info is set the output to > html (hard to parse) or query directly the database. 
> @@ -241,9 +271,10 @@ def get_cve_info(d, cves): > for row in cur.execute(query, tuple(cves)): > cve_data[row[0]] = {} > cve_data[row[0]]["summary"] = row[1] > - cve_data[row[0]]["score"] = row[2] > - cve_data[row[0]]["modified"] = row[3] > - cve_data[row[0]]["vector"] = row[4] > + cve_data[row[0]]["scorev2"] = row[2] > + cve_data[row[0]]["scorev3"] = row[3] > + cve_data[row[0]]["modified"] = row[4] > + cve_data[row[0]]["vector"] = row[5] > conn.close() > > return cve_data > @@ -270,7 +301,8 @@ def cve_write_data(d, patched, unpatched, cve_data): > unpatched_cves.append(cve) > write_string += "CVE STATUS: Unpatched\n" > write_string += "CVE SUMMARY: %s\n" % cve_data[cve]["summary"] > - write_string += "CVSS v2 BASE SCORE: %s\n" % > cve_data[cve]["score"] > + write_string += "CVSS v2 BASE SCORE: %s\n" % > cve_data[cve]["scorev2"] > + write_string += "CVSS v3 BASE SCORE: %s\n" % > cve_data[cve]["scorev3"] > write_string += "VECTOR: %s\n" % cve_data[cve]["vector"] > write_string += "MORE INFORMATION: %s%s\n\n" % (nvd_link, cve) > > diff --git a/meta/conf/distro/include/maintainers.inc > b/meta/conf/distro/include/maintainers.inc > index 672f0677922..c027901fdf0 100644 > --- a/meta/conf/distro/include/maintainers.inc > +++ b/meta/conf/distro/include/maintainers.inc > @@ -116,6 +116,7 @@ RECIPE_MAINTAINER_pn-cryptodev-tests = "Robert Yang < > liezhi.yang@windriver.com>" > RECIPE_MAINTAINER_pn-cups = "Chen Qi <Qi.Chen@windriver.com>" > RECIPE_MAINTAINER_pn-curl = "Armin Kuster <akuster808@gmail.com>" > RECIPE_MAINTAINER_pn-cve-check-tool = "Ross Burton <ross.burton@intel.com > >" > +RECIPE_MAINTAINER_pn-cve-update-db-native = "Ross Burton < > ross.burton@intel.com>" > RECIPE_MAINTAINER_pn-cwautomacros = "Ross Burton <ross.burton@intel.com>" > RECIPE_MAINTAINER_pn-db = "Mark Hatle <mark.hatle@windriver.com>" > RECIPE_MAINTAINER_pn-dbus = "Chen Qi <Qi.Chen@windriver.com>" > diff --git a/meta/recipes-core/glibc/glibc-locale.inc > b/meta/recipes-core/glibc/glibc-locale.inc > index 1b676dc26e7..97d83cb856d 100644 > --- a/meta/recipes-core/glibc/glibc-locale.inc > +++ b/meta/recipes-core/glibc/glibc-locale.inc > @@ -95,3 +95,6 @@ do_install () { > inherit libc-package > > BBCLASSEXTEND = "nativesdk" > + > +# Don't scan for CVEs as glibc will be scanned > +CVE_PRODUCT = "" > diff --git a/meta/recipes-core/glibc/glibc-mtrace.inc > b/meta/recipes-core/glibc/glibc-mtrace.inc > index d703c14bdc1..ef9d60ec239 100644 > --- a/meta/recipes-core/glibc/glibc-mtrace.inc > +++ b/meta/recipes-core/glibc/glibc-mtrace.inc > @@ -11,3 +11,6 @@ do_install() { > install -d -m 0755 ${D}${bindir} > install -m 0755 ${SRC}/mtrace ${D}${bindir}/ > } > + > +# Don't scan for CVEs as glibc will be scanned > +CVE_PRODUCT = "" > diff --git a/meta/recipes-core/glibc/glibc-scripts.inc > b/meta/recipes-core/glibc/glibc-scripts.inc > index 2a2b41507ed..14a14e45126 100644 > --- a/meta/recipes-core/glibc/glibc-scripts.inc > +++ b/meta/recipes-core/glibc/glibc-scripts.inc > @@ -18,3 +18,6 @@ do_install() { > # sotruss script requires sotruss-lib.so (given by libsotruss package), > # to produce trace of the library calls. 
> RDEPENDS_${PN} += "libsotruss" > + > +# Don't scan for CVEs as glibc will be scanned > +CVE_PRODUCT = "" > diff --git a/meta/recipes-core/meta/cve-update-db-native.bb > b/meta/recipes-core/meta/cve-update-db-native.bb > new file mode 100644 > index 00000000000..2c427a5884f > --- /dev/null > +++ b/meta/recipes-core/meta/cve-update-db-native.bb > @@ -0,0 +1,195 @@ > +SUMMARY = "Updates the NVD CVE database" > +LICENSE = "MIT" > + > +INHIBIT_DEFAULT_DEPS = "1" > + > +inherit native > + > +deltask do_unpack > +deltask do_patch > +deltask do_configure > +deltask do_compile > +deltask do_install > +deltask do_populate_sysroot > + > +python () { > + if not d.getVar("CVE_CHECK_DB_FILE"): > + raise bb.parse.SkipRecipe("Skip recipe when cve-check class is > not loaded.") > +} > + > +python do_populate_cve_db() { > + """ > + Update NVD database with json data feed > + """ > + > + import sqlite3, urllib, urllib.parse, shutil, gzip > + from datetime import date > + > + BASE_URL = "https://nvd.nist.gov/feeds/json/cve/1.0/nvdcve-1.0-" > + YEAR_START = 2002 > + > + db_dir = os.path.join(d.getVar("DL_DIR"), 'CVE_CHECK') > + db_file = os.path.join(db_dir, 'nvdcve_1.0.db') > + json_tmpfile = os.path.join(db_dir, 'nvd.json.gz') > + proxy = d.getVar("https_proxy") > + > + if proxy: > + # instantiate an opener but do not install it as the global > + # opener unless if we're really sure it's applicable for all > + # urllib requests > + proxy_handler = urllib.request.ProxyHandler({'https': proxy}) > + proxy_opener = urllib.request.build_opener(proxy_handler) > + else: > + proxy_opener = None > + > + cve_f = open(os.path.join(d.getVar("TMPDIR"), 'cve_check'), 'a') > + > + if not os.path.isdir(db_dir): > + os.mkdir(db_dir) > + > + # Connect to database > + conn = sqlite3.connect(db_file) > + c = conn.cursor() > + > + initialize_db(c) > + > + for year in range(YEAR_START, date.today().year + 1): > + year_url = BASE_URL + str(year) > + meta_url = year_url + ".meta" > + json_url = year_url + ".json.gz" > + > + # Retrieve meta last modified date > + > + response = None > + > + if proxy_opener: > + response = proxy_opener.open(meta_url) > + else: > + req = urllib.request.Request(meta_url) > + response = urllib.request.urlopen(req) > + > + if response: > + for l in response.read().decode("utf-8").splitlines(): > + key, value = l.split(":", 1) > + if key == "lastModifiedDate": > + last_modified = value > + break > + else: > + bb.warn("Cannot parse CVE metadata, update failed") > + return > + > + # Compare with current db last modified date > + c.execute("select DATE from META where YEAR = ?", (year,)) > + meta = c.fetchone() > + if not meta or meta[0] != last_modified: > + # Clear products table entries corresponding to current year > + c.execute("delete from PRODUCTS where ID like ?", ('CVE-%d%%' > % year,)) > + > + # Update db with current year json file > + try: > + if proxy_opener: > + response = proxy_opener.open(json_url) > + else: > + req = urllib.request.Request(json_url) > + response = urllib.request.urlopen(req) > + > + if response: > + update_db(c, > gzip.decompress(response.read()).decode('utf-8')) > + c.execute("insert or replace into META values (?, ?)", > [year, last_modified]) > + except urllib.error.URLError as e: > + cve_f.write('Warning: CVE db update error, CVE data is > outdated.\n\n') > + bb.warn("Cannot parse CVE data (%s), update failed" % > e.reason) > + return > + > + # Update success, set the date to cve_check file. 
> + if year == date.today().year: > + cve_f.write('CVE database update : %s\n\n' % date.today()) > + > + cve_f.close() > + conn.commit() > + conn.close() > +} > + > +def initialize_db(c): > + c.execute("CREATE TABLE IF NOT EXISTS META (YEAR INTEGER UNIQUE, DATE > TEXT)") > + c.execute("CREATE TABLE IF NOT EXISTS NVD (ID TEXT UNIQUE, SUMMARY > TEXT, \ > + SCOREV2 TEXT, SCOREV3 TEXT, MODIFIED INTEGER, VECTOR TEXT)") > + c.execute("CREATE TABLE IF NOT EXISTS PRODUCTS (ID TEXT, \ > + VENDOR TEXT, PRODUCT TEXT, VERSION_START TEXT, OPERATOR_START > TEXT, \ > + VERSION_END TEXT, OPERATOR_END TEXT)") > + > +def parse_node_and_insert(c, node, cveId): > + # Parse children node if needed > + for child in node.get('children', ()): > + parse_node_and_insert(c, child, cveId) > + > + def cpe_generator(): > + for cpe in node.get('cpe_match', ()): > + if not cpe['vulnerable']: > + return > + cpe23 = cpe['cpe23Uri'].split(':') > + vendor = cpe23[3] > + product = cpe23[4] > + version = cpe23[5] > + > + if version != '*': > + # Version is defined, this is a '=' match > + yield [cveId, vendor, product, version, '=', '', ''] > + else: > + # Parse start version, end version and operators > + op_start = '' > + op_end = '' > + v_start = '' > + v_end = '' > + > + if 'versionStartIncluding' in cpe: > + op_start = '>=' > + v_start = cpe['versionStartIncluding'] > + > + if 'versionStartExcluding' in cpe: > + op_start = '>' > + v_start = cpe['versionStartExcluding'] > + > + if 'versionEndIncluding' in cpe: > + op_end = '<=' > + v_end = cpe['versionEndIncluding'] > + > + if 'versionEndExcluding' in cpe: > + op_end = '<' > + v_end = cpe['versionEndExcluding'] > + > + yield [cveId, vendor, product, v_start, op_start, v_end, > op_end] > + > + c.executemany("insert into PRODUCTS values (?, ?, ?, ?, ?, ?, ?)", > cpe_generator()) > + > +def update_db(c, jsondata): > + import json > + root = json.loads(jsondata) > + > + for elt in root['CVE_Items']: > + if not elt['impact']: > + continue > + > + cveId = elt['cve']['CVE_data_meta']['ID'] > + cveDesc = > elt['cve']['description']['description_data'][0]['value'] > + date = elt['lastModifiedDate'] > + accessVector = > elt['impact']['baseMetricV2']['cvssV2']['accessVector'] > + cvssv2 = elt['impact']['baseMetricV2']['cvssV2']['baseScore'] > + > + try: > + cvssv3 = elt['impact']['baseMetricV3']['cvssV3']['baseScore'] > + except: > + cvssv3 = 0.0 > + > + c.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?)", > + [cveId, cveDesc, cvssv2, cvssv3, date, accessVector]) > + > + configurations = elt['configurations']['nodes'] > + for config in configurations: > + parse_node_and_insert(c, config, cveId) > + > + > +addtask do_populate_cve_db before do_fetch > +do_populate_cve_db[nostamp] = "1" > + > +EXCLUDE_FROM_WORLD = "1" > diff --git a/meta/recipes-devtools/cve-check-tool/cve-check-tool_5.6.4.bb > b/meta/recipes-devtools/cve-check-tool/cve-check-tool_5.6.4.bb > deleted file mode 100644 > index 1c84fb1cf2d..00000000000 > --- a/meta/recipes-devtools/cve-check-tool/cve-check-tool_5.6.4.bb > +++ /dev/null > @@ -1,62 +0,0 @@ > -SUMMARY = "cve-check-tool" > -DESCRIPTION = "cve-check-tool is a tool for checking known (public) CVEs.\ > -The tool will identify potentially vunlnerable software packages within > Linux distributions through version matching." 
> -HOMEPAGE = "https://github.com/ikeydoherty/cve-check-tool" > -SECTION = "Development/Tools" > -LICENSE = "GPL-2.0+" > -LIC_FILES_CHKSUM = "file://LICENSE;md5=e8c1458438ead3c34974bc0be3a03ed6" > - > -SRC_URI = " > https://github.com/ikeydoherty/${BPN}/releases/download/v${PV}/${BP}.tar.xz > \ > - file://check-for-malloc_trim-before-using-it.patch \ > - > file://0001-print-progress-in-percent-when-downloading-CVE-db.patch \ > - > file://0001-curl-allow-overriding-default-CA-certificate-file.patch \ > - > file://0001-update-Compare-computed-vs-expected-sha256-digit-str.patch \ > - file://0001-Fix-freeing-memory-allocated-by-sqlite.patch \ > - " > - > -SRC_URI[md5sum] = "c5f4247140fc9be3bf41491d31a34155" > -SRC_URI[sha256sum] = > "b8f283be718af8d31232ac1bfc10a0378fb958aaaa49af39168f8acf501e6a5b" > - > -UPSTREAM_CHECK_URI = " > https://github.com/ikeydoherty/cve-check-tool/releases" > - > -DEPENDS = "libcheck glib-2.0 json-glib curl libxml2 sqlite3 openssl > ca-certificates" > - > -RDEPENDS_${PN} = "ca-certificates" > - > -inherit pkgconfig autotools > - > -EXTRA_OECONF = "--disable-coverage --enable-relative-plugins" > -CFLAGS_append = " -Wno-error=pedantic" > - > -do_populate_cve_db() { > - if [ "${BB_NO_NETWORK}" = "1" ] ; then > - bbwarn "BB_NO_NETWORK is set; Can't update cve-check-tool > database, new CVEs won't be detected" > - return > - fi > - > - # In case we don't inherit cve-check class, use default values > defined in the class. > - cve_dir="${CVE_CHECK_DB_DIR}" > - cve_file="${CVE_CHECK_TMP_FILE}" > - > - [ -z "${cve_dir}" ] && cve_dir="${DL_DIR}/CVE_CHECK" > - [ -z "${cve_file}" ] && cve_file="${TMPDIR}/cve_check" > - > - unused="${@bb.utils.export_proxies(d)}" > - bbdebug 2 "Updating cve-check-tool database located in $cve_dir" > - # --cacert works around curl-native not finding the CA bundle > - if cve-check-update --cacert > ${sysconfdir}/ssl/certs/ca-certificates.crt -d "$cve_dir" ; then > - printf "CVE database was updated on %s UTC\n\n" "$(LANG=C date > --utc +'%F %T')" > "$cve_file" > - else > - bbwarn "Error in executing cve-check-update" > - if [ "${@'1' if bb.data.inherits_class('cve-check', d) else '0'}" > -ne 0 ] ; then > - bbwarn "Failed to update cve-check-tool database, CVEs won't > be checked" > - fi > - fi > -} > - > -addtask populate_cve_db after do_populate_sysroot > -do_populate_cve_db[depends] = "cve-check-tool-native:do_populate_sysroot" > -do_populate_cve_db[nostamp] = "1" > -do_populate_cve_db[progress] = "percent" > - > -BBCLASSEXTEND = "native nativesdk" > diff --git > a/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch > b/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch > deleted file mode 100644 > index 4a82cf2dded..00000000000 > --- > a/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch > +++ /dev/null > @@ -1,50 +0,0 @@ > -From a3353429652f83bb8b0316500faa88fa2555542d Mon Sep 17 00:00:00 2001 > -From: Peter Marko <peter.marko@siemens.com> > -Date: Thu, 13 Apr 2017 23:09:52 +0200 > -Subject: [PATCH] Fix freeing memory allocated by sqlite > - > -Upstream-Status: Backport > -Signed-off-by: Peter Marko <peter.marko@siemens.com> > ---- > - src/core.c | 8 ++++---- > - 1 file changed, 4 insertions(+), 4 deletions(-) > - > -diff --git a/src/core.c b/src/core.c > -index 6263031..6788f16 100644 > ---- a/src/core.c > -+++ b/src/core.c > -@@ -82,7 +82,7 @@ static bool ensure_table(CveDB *self) > - rc = sqlite3_exec(self->db, query, 
NULL, NULL, &err); > - if (rc != SQLITE_OK) { > - fprintf(stderr, "ensure_table(): %s\n", err); > -- free(err); > -+ sqlite3_free(err); > - return false; > - } > - > -@@ -91,7 +91,7 @@ static bool ensure_table(CveDB *self) > - rc = sqlite3_exec(self->db, query, NULL, NULL, &err); > - if (rc != SQLITE_OK) { > - fprintf(stderr, "ensure_table(): %s\n", err); > -- free(err); > -+ sqlite3_free(err); > - return false; > - } > - > -@@ -99,11 +99,11 @@ static bool ensure_table(CveDB *self) > - rc = sqlite3_exec(self->db, query, NULL, NULL, &err); > - if (rc != SQLITE_OK) { > - fprintf(stderr, "ensure_table(): %s\n", err); > -- free(err); > -+ sqlite3_free(err); > - return false; > - } > - if (err) { > -- free(err); > -+ sqlite3_free(err); > - } > - > - return true; > --- > -2.1.4 > - > diff --git > a/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch > b/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch > deleted file mode 100644 > index 3d8ebd1bd26..00000000000 > --- > a/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch > +++ /dev/null > @@ -1,215 +0,0 @@ > -From 825a9969dea052b02ba868bdf39e676349f10dce Mon Sep 17 00:00:00 2001 > -From: Jussi Kukkonen <jussi.kukkonen@intel.com> > -Date: Thu, 9 Feb 2017 14:51:28 +0200 > -Subject: [PATCH] curl: allow overriding default CA certificate file > - > -Similar to curl, --cacert can now be used in cve-check-tool and > -cve-check-update to override the default CA certificate file. Useful > -in cases where the system default is unsuitable (for example, > -out-dated) or broken (as in OE's current native libcurl, which embeds > -a path string from one build host and then uses it on another although > -the right path may have become something different). > - > -Upstream-Status: Submitted [ > https://github.com/ikeydoherty/cve-check-tool/pull/45] > - > -Signed-off-by: Patrick Ohly <patrick.ohly@intel.com> > - > - > -Took Patrick Ohlys original patch from meta-security-isafw, rebased > -on top of other patches. 
> - > -Signed-off-by: Jussi Kukkonen <jussi.kukkonen@intel.com> > ---- > - src/library/cve-check-tool.h | 1 + > - src/library/fetch.c | 10 +++++++++- > - src/library/fetch.h | 3 ++- > - src/main.c | 5 ++++- > - src/update-main.c | 4 +++- > - src/update.c | 12 +++++++----- > - src/update.h | 2 +- > - 7 files changed, 27 insertions(+), 10 deletions(-) > - > -diff --git a/src/library/cve-check-tool.h b/src/library/cve-check-tool.h > -index e4bb5b1..f89eade 100644 > ---- a/src/library/cve-check-tool.h > -+++ b/src/library/cve-check-tool.h > -@@ -43,6 +43,7 @@ typedef struct CveCheckTool { > - bool bugs; /**<Whether bug tracking is > enabled */ > - GHashTable *mapping; /**<CVE Mapping */ > - const char *output_file; /**<Output file, if any */ > -+ const char *cacert_file; /**<Non-default SSL certificate > file, if any */ > - } CveCheckTool; > - > - /** > -diff --git a/src/library/fetch.c b/src/library/fetch.c > -index 0fe6d76..8f998c3 100644 > ---- a/src/library/fetch.c > -+++ b/src/library/fetch.c > -@@ -60,7 +60,8 @@ static int progress_callback_new(void *ptr, curl_off_t > dltotal, curl_off_t dlnow > - } > - > - FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, > -- unsigned int start_percent, unsigned int > end_percent) > -+ unsigned int start_percent, unsigned int > end_percent, > -+ const char *cacert_file) > - { > - FetchStatus ret = FETCH_STATUS_FAIL; > - CURLcode res; > -@@ -74,6 +75,13 @@ FetchStatus fetch_uri(const char *uri, const char > *target, bool verbose, > - return ret; > - } > - > -+ if (cacert_file) { > -+ res = curl_easy_setopt(curl, CURLOPT_CAINFO, > cacert_file); > -+ if (res != CURLE_OK) { > -+ goto bail; > -+ } > -+ } > -+ > - if (stat(target, &st) == 0) { > - res = curl_easy_setopt(curl, CURLOPT_TIMECONDITION, > CURL_TIMECOND_IFMODSINCE); > - if (res != CURLE_OK) { > -diff --git a/src/library/fetch.h b/src/library/fetch.h > -index 4cce5d1..836c7d7 100644 > ---- a/src/library/fetch.h > -+++ b/src/library/fetch.h > -@@ -29,7 +29,8 @@ typedef enum { > - * @return A FetchStatus, indicating the operation taken > - */ > - FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, > -- unsigned int this_percent, unsigned int > next_percent); > -+ unsigned int this_percent, unsigned int > next_percent, > -+ const char *cacert_file); > - > - /** > - * Attempt to extract the given gzipped file > -diff --git a/src/main.c b/src/main.c > -index 8e6f158..ae69d47 100644 > ---- a/src/main.c > -+++ b/src/main.c > -@@ -280,6 +280,7 @@ static bool csv_mode = false; > - static char *modified_stamp = NULL; > - static gchar *mapping_file = NULL; > - static gchar *output_file = NULL; > -+static gchar *cacert_file = NULL; > - > - static GOptionEntry _entries[] = { > - { "not-patched", 'n', 0, G_OPTION_ARG_NONE, &hide_patched, "Hide > patched/addressed CVEs", NULL }, > -@@ -294,6 +295,7 @@ static GOptionEntry _entries[] = { > - { "csv", 'c', 0, G_OPTION_ARG_NONE, &csv_mode, "Output CSV > formatted data only", NULL }, > - { "mapping", 'M', 0, G_OPTION_ARG_STRING, &mapping_file, "Path > to a mapping file", NULL}, > - { "output-file", 'o', 0, G_OPTION_ARG_STRING, &output_file, > "Path to the output file (output plugin specific)", NULL}, > -+ { "cacert", 'C', 0, G_OPTION_ARG_STRING, &cacert_file, "Path to > the combined SSL certificates file (system default is used if not set)", > NULL}, > - { .short_name = 0 } > - }; > - > -@@ -492,6 +494,7 @@ int main(int argc, char **argv) > - > - quiet = csv_mode || !no_html; > - self->output_file = output_file; > -+ 
self->cacert_file = cacert_file; > - > - if (!csv_mode && self->output_file) { > - quiet = false; > -@@ -530,7 +533,7 @@ int main(int argc, char **argv) > - if (status) { > - fprintf(stderr, "Update of db forced\n"); > - cve_db_unlock(); > -- if (!update_db(quiet, db_path->str)) { > -+ if (!update_db(quiet, db_path->str, > self->cacert_file)) { > - fprintf(stderr, "DB update failure\n"); > - goto cleanup; > - } > -diff --git a/src/update-main.c b/src/update-main.c > -index 2379cfa..c52d9d0 100644 > ---- a/src/update-main.c > -+++ b/src/update-main.c > -@@ -43,11 +43,13 @@ the Free Software Foundation; either version 2 of the > License, or\n\ > - static gchar *nvds = NULL; > - static bool _show_version = false; > - static bool _quiet = false; > -+static const char *_cacert_file = NULL; > - > - static GOptionEntry _entries[] = { > - { "nvd-dir", 'd', 0, G_OPTION_ARG_STRING, &nvds, "NVD directory > in filesystem", NULL }, > - { "version", 'v', 0, G_OPTION_ARG_NONE, &_show_version, "Show > version", NULL }, > - { "quiet", 'q', 0, G_OPTION_ARG_NONE, &_quiet, "Run silently", > NULL }, > -+ { "cacert", 'C', 0, G_OPTION_ARG_STRING, &_cacert_file, "Path to > the combined SSL certificates file (system default is used if not set)", > NULL}, > - { .short_name = 0 } > - }; > - > -@@ -88,7 +90,7 @@ int main(int argc, char **argv) > - goto end; > - } > - > -- if (update_db(_quiet, db_path->str)) { > -+ if (update_db(_quiet, db_path->str, _cacert_file)) { > - ret = EXIT_SUCCESS; > - } else { > - fprintf(stderr, "Failed to update database\n"); > -diff --git a/src/update.c b/src/update.c > -index 070560a..8cb4a39 100644 > ---- a/src/update.c > -+++ b/src/update.c > -@@ -267,7 +267,8 @@ static inline void update_end(int fd, const char > *update_fname, bool ok) > - > - static int do_fetch_update(int year, const char *db_dir, CveDB *cve_db, > - bool db_exist, bool verbose, > -- unsigned int this_percent, unsigned int > next_percent) > -+ unsigned int this_percent, unsigned int > next_percent, > -+ const char *cacert_file) > - { > - const char nvd_uri[] = URI_PREFIX; > - autofree(cve_string) *uri_meta = NULL; > -@@ -331,14 +332,14 @@ refetch: > - } > - > - /* Fetch NVD META file */ > -- st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, > this_percent, this_percent); > -+ st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, > this_percent, this_percent, cacert_file); > - if (st == FETCH_STATUS_FAIL) { > - fprintf(stderr, "Failed to fetch %s\n", uri_meta->str); > - return -1; > - } > - > - /* Fetch NVD XML file */ > -- st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, > this_percent, next_percent); > -+ st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, > this_percent, next_percent, cacert_file); > - switch (st) { > - case FETCH_STATUS_FAIL: > - fprintf(stderr, "Failed to fetch %s\n", > uri_data_gz->str); > -@@ -391,7 +392,7 @@ refetch: > - return 0; > - } > - > --bool update_db(bool quiet, const char *db_file) > -+bool update_db(bool quiet, const char *db_file, const char *cacert_file) > - { > - autofree(char) *db_dir = NULL; > - autofree(CveDB) *cve_db = NULL; > -@@ -466,7 +467,8 @@ bool update_db(bool quiet, const char *db_file) > - if (!quiet) > - fprintf(stderr, "completed: %u%%\r", > start_percent); > - rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet, > -- start_percent, end_percent); > -+ start_percent, end_percent, > -+ cacert_file); > - switch (rc) { > - case 0: > - if (!quiet) > -diff --git a/src/update.h b/src/update.h > -index b8e9911..ceea0c3 100644 > ---- 
a/src/update.h > -+++ b/src/update.h > -@@ -15,7 +15,7 @@ cve_string *get_db_path(const char *path); > - > - int update_required(const char *db_file); > - > --bool update_db(bool quiet, const char *db_file); > -+bool update_db(bool quiet, const char *db_file, const char *cacert_file); > - > - > - /* > --- > -2.1.4 > - > diff --git > a/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch > b/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch > deleted file mode 100644 > index 8ea6f686e3f..00000000000 > --- > a/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch > +++ /dev/null > @@ -1,135 +0,0 @@ > -From e9ed26cde63f8ca7607a010a518329339f8c02d3 Mon Sep 17 00:00:00 2001 > -From: =?UTF-8?q?Andr=C3=A9=20Draszik?= <git@andred.net> > -Date: Mon, 26 Sep 2016 12:12:41 +0100 > -Subject: [PATCH] print progress in percent when downloading CVE db > -MIME-Version: 1.0 > -Content-Type: text/plain; charset=UTF-8 > -Content-Transfer-Encoding: 8bit > - > -Upstream-Status: Pending > -Signed-off-by: André Draszik <git@andred.net> > ---- > - src/library/fetch.c | 28 +++++++++++++++++++++++++++- > - src/library/fetch.h | 3 ++- > - src/update.c | 16 ++++++++++++---- > - 3 files changed, 41 insertions(+), 6 deletions(-) > - > -diff --git a/src/library/fetch.c b/src/library/fetch.c > -index 06d4b30..0fe6d76 100644 > ---- a/src/library/fetch.c > -+++ b/src/library/fetch.c > -@@ -37,13 +37,37 @@ static size_t write_func(void *ptr, size_t size, > size_t nmemb, struct fetch_t *f > - return fwrite(ptr, size, nmemb, f->f); > - } > - > --FetchStatus fetch_uri(const char *uri, const char *target, bool verbose) > -+struct percent_t { > -+ unsigned int start; > -+ unsigned int end; > -+}; > -+ > -+static int progress_callback_new(void *ptr, curl_off_t dltotal, > curl_off_t dlnow, curl_off_t ultotal, curl_off_t ulnow) > -+{ > -+ (void) ultotal; > -+ (void) ulnow; > -+ > -+ struct percent_t *percent = (struct percent_t *) ptr; > -+ > -+ if (dltotal && percent && percent->end >= percent->start) { > -+ unsigned int diff = percent->end - percent->start; > -+ if (diff) { > -+ fprintf(stderr,"completed: > %"CURL_FORMAT_CURL_OFF_T"%%\r", percent->start + (diff * dlnow / dltotal)); > -+ } > -+ } > -+ > -+ return 0; > -+} > -+ > -+FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, > -+ unsigned int start_percent, unsigned int > end_percent) > - { > - FetchStatus ret = FETCH_STATUS_FAIL; > - CURLcode res; > - struct stat st; > - CURL *curl = NULL; > - struct fetch_t *f = NULL; > -+ struct percent_t percent = { .start = start_percent, .end = > end_percent }; > - > - curl = curl_easy_init(); > - if (!curl) { > -@@ -67,6 +91,8 @@ FetchStatus fetch_uri(const char *uri, const char > *target, bool verbose) > - } > - if (verbose) { > - (void)curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 0L); > -+ (void)curl_easy_setopt(curl, CURLOPT_XFERINFODATA, > &percent); > -+ (void)curl_easy_setopt(curl, CURLOPT_XFERINFOFUNCTION, > progress_callback_new); > - } > - res = curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, > (curl_write_callback)write_func); > - if (res != CURLE_OK) { > -diff --git a/src/library/fetch.h b/src/library/fetch.h > -index 70c3779..4cce5d1 100644 > ---- a/src/library/fetch.h > -+++ b/src/library/fetch.h > -@@ -28,7 +28,8 @@ typedef enum { > - * @param verbose Whether to be verbose > - * @return A FetchStatus, indicating the operation taken > - */ > 
--FetchStatus fetch_uri(const char *uri, const char *target, bool verbose); > -+FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, > -+ unsigned int this_percent, unsigned int > next_percent); > - > - /** > - * Attempt to extract the given gzipped file > -diff --git a/src/update.c b/src/update.c > -index 30fbe96..eaeeefd 100644 > ---- a/src/update.c > -+++ b/src/update.c > -@@ -266,7 +266,8 @@ static inline void update_end(int fd, const char > *update_fname, bool ok) > - } > - > - static int do_fetch_update(int year, const char *db_dir, CveDB *cve_db, > -- bool db_exist, bool verbose) > -+ bool db_exist, bool verbose, > -+ unsigned int this_percent, unsigned int > next_percent) > - { > - const char nvd_uri[] = URI_PREFIX; > - autofree(cve_string) *uri_meta = NULL; > -@@ -330,14 +331,14 @@ refetch: > - } > - > - /* Fetch NVD META file */ > -- st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose); > -+ st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, > this_percent, this_percent); > - if (st == FETCH_STATUS_FAIL) { > - fprintf(stderr, "Failed to fetch %s\n", uri_meta->str); > - return -1; > - } > - > - /* Fetch NVD XML file */ > -- st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose); > -+ st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, > this_percent, next_percent); > - switch (st) { > - case FETCH_STATUS_FAIL: > - fprintf(stderr, "Failed to fetch %s\n", > uri_data_gz->str); > -@@ -459,10 +460,17 @@ bool update_db(bool quiet, const char *db_file) > - for (int i = YEAR_START; i <= year+1; i++) { > - int y = i > year ? -1 : i; > - int rc; > -+ unsigned int start_percent = ((i+0 - YEAR_START) * 100) > / (year+2 - YEAR_START); > -+ unsigned int end_percent = ((i+1 - YEAR_START) * 100) / > (year+2 - YEAR_START); > - > -- rc = do_fetch_update(y, db_dir, cve_db, db_exist, > !quiet); > -+ if (!quiet) > -+ fprintf(stderr, "completed: %u%%\r", > start_percent); > -+ rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet, > -+ start_percent, end_percent); > - switch (rc) { > - case 0: > -+ if (!quiet) > -+ fprintf(stderr,"completed: %u%%\r", > end_percent); > - continue; > - case ENOMEM: > - goto oom; > --- > -2.9.3 > - > diff --git > a/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch > b/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch > deleted file mode 100644 > index 458c0cc84e5..00000000000 > --- > a/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch > +++ /dev/null > @@ -1,52 +0,0 @@ > -From b0426e63c9ac61657e029f689bcb8dd051e752c6 Mon Sep 17 00:00:00 2001 > -From: Sergey Popovich <popovich_sergei@mail.ua> > -Date: Fri, 21 Apr 2017 07:32:23 -0700 > -Subject: [PATCH] update: Compare computed vs expected sha256 digit string > - ignoring case > - > -We produce sha256 digest string using %x snprintf() > -qualifier for each byte of digest which uses alphabetic > -characters from "a" to "f" in lower case to represent > -integer values from 10 to 15. > - > -Previously all of the NVD META files supply sha256 > -digest string for corresponding XML file in lower case. > - > -However due to some reason this changed recently to > -provide digest digits in upper case causing fetched > -data consistency checks to fail. This prevents database > -from being updated periodically. 
> - > -While commit c4f6e94 (update: Do not treat sha256 failure > -as fatal if requested) adds useful option to skip > -digest validation at all and thus provides workaround for > -this situation, it might be unacceptable for some > -deployments where we need to ensure that downloaded > -data is consistent before start parsing it and update > -SQLite database. > - > -Use strcasecmp() to compare two digest strings case > -insensitively and addressing this case. > - > -Upstream-Status: Backport > -Signed-off-by: Sergey Popovich <popovich_sergei@mail.ua> > ---- > - src/update.c | 2 +- > - 1 file changed, 1 insertion(+), 1 deletion(-) > - > -diff --git a/src/update.c b/src/update.c > -index 8588f38..3cc6b67 100644 > ---- a/src/update.c > -+++ b/src/update.c > -@@ -187,7 +187,7 @@ static bool nvdcve_data_ok(const char *meta, const > char *data) > - snprintf(&csum_data[idx], len, "%02hhx", digest[i]); > - } > - > -- ret = streq(csum_meta, csum_data); > -+ ret = !strcasecmp(csum_meta, csum_data); > - > - err_unmap: > - munmap(buffer, length); > --- > -2.11.0 > - > diff --git > a/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch > b/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch > deleted file mode 100644 > index 0774ad946a4..00000000000 > --- > a/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch > +++ /dev/null > @@ -1,51 +0,0 @@ > -From ce64633b9733e962b8d8482244301f614d8b5845 Mon Sep 17 00:00:00 2001 > -From: Khem Raj <raj.khem@gmail.com> > -Date: Mon, 22 Aug 2016 22:54:24 -0700 > -Subject: [PATCH] Check for malloc_trim before using it > - > -malloc_trim is gnu specific and not all libc > -implement it, threfore write a configure check > -to poke for it first and use the define to > -guard its use. > - > -Helps in compiling on musl based systems > - > -Signed-off-by: Khem Raj <raj.khem@gmail.com> > ---- > -Upstream-Status: Submitted [ > https://github.com/ikeydoherty/cve-check-tool/pull/48] > - configure.ac | 2 ++ > - src/core.c | 4 ++-- > - 2 files changed, 4 insertions(+), 2 deletions(-) > - > -diff --git a/configure.ac b/configure.ac > -index d3b66ce..79c3542 100644 > ---- a/configure.ac > -+++ b/configure.ac > -@@ -19,6 +19,8 @@ m4_define([json_required_version], [0.16.0]) > - m4_define([openssl_required_version],[1.0.0]) > - # TODO: Set minimum sqlite > - > -+AC_CHECK_FUNCS_ONCE(malloc_trim) > -+ > - PKG_CHECK_MODULES(CVE_CHECK_TOOL, > - [ > - glib-2.0 >= glib_required_version, > -diff --git a/src/core.c b/src/core.c > -index 6263031..0d5df29 100644 > ---- a/src/core.c > -+++ b/src/core.c > -@@ -498,9 +498,9 @@ bool cve_db_load(CveDB *self, const char *fname) > - } > - > - b = true; > -- > -+#ifdef HAVE_MALLOC_TRIM > - malloc_trim(0); > -- > -+#endif > - xmlFreeTextReader(r); > - if (fd) { > - close(fd); > --- > -2.9.3 > - > -- > 2.20.1
elt['impact']['baseMetricV3']['cvssV3']['baseScore']<br> +    except:<br> +      cvssv3 = 0.0<br> +<br> +    c.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?)",<br> +        [cveId, cveDesc, cvssv2, cvssv3, date, accessVector])<br> +<br> +    configurations = elt['configurations']['nodes']<br> +    for config in configurations:<br> +      parse_node_and_insert(c, config, cveId)<br> +<br> +<br> +addtask do_populate_cve_db before do_fetch<br> +do_populate_cve_db[nostamp] = "1"<br> +<br> +EXCLUDE_FROM_WORLD = "1"<br> diff --git a/meta/recipes-devtools/cve-check-tool/<a href="http://cve-check-tool_5.6.4.bb" rel="noreferrer" target="_blank">cve-check-tool_5.6.4.bb</a> b/meta/recipes-devtools/cve-check-tool/<a href="http://cve-check-tool_5.6.4.bb" rel="noreferrer" target="_blank">cve-check-tool_5.6.4.bb</a><br> deleted file mode 100644<br> index 1c84fb1cf2d..00000000000<br> --- a/meta/recipes-devtools/cve-check-tool/<a href="http://cve-check-tool_5.6.4.bb" rel="noreferrer" target="_blank">cve-check-tool_5.6.4.bb</a><br> +++ /dev/null<br> @@ -1,62 +0,0 @@<br> -SUMMARY = "cve-check-tool"<br> -DESCRIPTION = "cve-check-tool is a tool for checking known (public) CVEs.\<br> -The tool will identify potentially vunlnerable software packages within Linux distributions through version matching."<br> -HOMEPAGE = "<a href="https://github.com/ikeydoherty/cve-check-tool" rel="noreferrer" target="_blank">https://github.com/ikeydoherty/cve-check-tool</a>"<br> -SECTION = "Development/Tools"<br> -LICENSE = "GPL-2.0+"<br> -LIC_FILES_CHKSUM = "file://LICENSE;md5=e8c1458438ead3c34974bc0be3a03ed6"<br> -<br> -SRC_URI = "<a href="https://github.com/ikeydoherty/$%7BBPN%7D/releases/download/v$%7BPV%7D/$%7BBP%7D.tar.xz" rel="noreferrer" target="_blank">https://github.com/ikeydoherty/${BPN}/releases/download/v${PV}/${BP}.tar.xz</a> \<br> -      file://check-for-malloc_trim-before-using-it.patch \<br> -      file://0001-print-progress-in-percent-when-downloading-CVE-db.patch \<br> -      file://0001-curl-allow-overriding-default-CA-certificate-file.patch \<br> -      file://0001-update-Compare-computed-vs-expected-sha256-digit-str.patch \<br> -      file://0001-Fix-freeing-memory-allocated-by-sqlite.patch \<br> -     "<br> -<br> -SRC_URI[md5sum] = "c5f4247140fc9be3bf41491d31a34155"<br> -SRC_URI[sha256sum] = "b8f283be718af8d31232ac1bfc10a0378fb958aaaa49af39168f8acf501e6a5b"<br> -<br> -UPSTREAM_CHECK_URI = "<a href="https://github.com/ikeydoherty/cve-check-tool/releases" rel="noreferrer" target="_blank">https://github.com/ikeydoherty/cve-check-tool/releases</a>"<br> -<br> -DEPENDS = "libcheck glib-2.0 json-glib curl libxml2 sqlite3 openssl ca-certificates"<br> -<br> -RDEPENDS_${PN} = "ca-certificates"<br> -<br> -inherit pkgconfig autotools<br> -<br> -EXTRA_OECONF = "--disable-coverage --enable-relative-plugins"<br> -CFLAGS_append = " -Wno-error=pedantic"<br> -<br> -do_populate_cve_db() {<br> -  if [ "${BB_NO_NETWORK}" = "1" ] ; then<br> -    bbwarn "BB_NO_NETWORK is set; Can't update cve-check-tool database, new CVEs won't be detected"<br> -    return<br> -  fi<br> -<br> -  # In case we don't inherit cve-check class, use default values defined in the class.<br> -  cve_dir="${CVE_CHECK_DB_DIR}"<br> -  cve_file="${CVE_CHECK_TMP_FILE}"<br> -<br> -  [ -z "${cve_dir}" ] && cve_dir="${DL_DIR}/CVE_CHECK"<br> -  [ -z "${cve_file}" ] && cve_file="${TMPDIR}/cve_check"<br> -<br> -  unused="${@bb.utils.export_proxies(d)}"<br> -  bbdebug 2 "Updating cve-check-tool database located in $cve_dir"<br> -  # --cacert works around 
curl-native not finding the CA bundle<br> -  if cve-check-update --cacert ${sysconfdir}/ssl/certs/ca-certificates.crt -d "$cve_dir" ; then<br> -    printf "CVE database was updated on %s UTC\n\n" "$(LANG=C date --utc +'%F %T')" > "$cve_file"<br> -  else<br> -    bbwarn "Error in executing cve-check-update"<br> -    if [ "${@'1' if bb.data.inherits_class('cve-check', d) else '0'}" -ne 0 ] ; then<br> -      bbwarn "Failed to update cve-check-tool database, CVEs won't be checked"<br> -    fi<br> -  fi<br> -}<br> -<br> -addtask populate_cve_db after do_populate_sysroot<br> -do_populate_cve_db[depends] = "cve-check-tool-native:do_populate_sysroot"<br> -do_populate_cve_db[nostamp] = "1"<br> -do_populate_cve_db[progress] = "percent"<br> -<br> -BBCLASSEXTEND = "native nativesdk"<br> diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch b/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch<br> deleted file mode 100644<br> index 4a82cf2dded..00000000000<br> --- a/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch<br> +++ /dev/null<br> @@ -1,50 +0,0 @@<br> -From a3353429652f83bb8b0316500faa88fa2555542d Mon Sep 17 00:00:00 2001<br> -From: Peter Marko <<a href="mailto:peter.marko@siemens.com" target="_blank">peter.marko@siemens.com</a>><br> -Date: Thu, 13 Apr 2017 23:09:52 +0200<br> -Subject: [PATCH] Fix freeing memory allocated by sqlite<br> -<br> -Upstream-Status: Backport<br> -Signed-off-by: Peter Marko <<a href="mailto:peter.marko@siemens.com" target="_blank">peter.marko@siemens.com</a>><br> ----<br> - src/core.c | 8 ++++----<br> - 1 file changed, 4 insertions(+), 4 deletions(-)<br> -<br> -diff --git a/src/core.c b/src/core.c<br> -index 6263031..6788f16 100644<br> ---- a/src/core.c<br> -+++ b/src/core.c<br> -@@ -82,7 +82,7 @@ static bool ensure_table(CveDB *self)<br> -     rc = sqlite3_exec(self->db, query, NULL, NULL, &err);<br> -     if (rc != SQLITE_OK) {<br> -         fprintf(stderr, "ensure_table(): %s\n", err);<br> --        free(err);<br> -+        sqlite3_free(err);<br> -         return false;<br> -     }<br> -     <br> -@@ -91,7 +91,7 @@ static bool ensure_table(CveDB *self)<br> -     rc = sqlite3_exec(self->db, query, NULL, NULL, &err);<br> -     if (rc != SQLITE_OK) {<br> -         fprintf(stderr, "ensure_table(): %s\n", err);<br> --        free(err);<br> -+        sqlite3_free(err);<br> -         return false;<br> -     }<br> - <br> -@@ -99,11 +99,11 @@ static bool ensure_table(CveDB *self)<br> -     rc = sqlite3_exec(self->db, query, NULL, NULL, &err);<br> -     if (rc != SQLITE_OK) {<br> -         fprintf(stderr, "ensure_table(): %s\n", err);<br> --        free(err);<br> -+        sqlite3_free(err);<br> -         return false;<br> -     }<br> -     if (err) {<br> --        free(err);<br> -+        sqlite3_free(err);<br> -     }<br> - <br> -     return true;<br> --- <br> -2.1.4<br> -<br> diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch b/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch<br> deleted file mode 100644<br> index 3d8ebd1bd26..00000000000<br> --- a/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch<br> +++ /dev/null<br> @@ -1,215 +0,0 @@<br> -From 825a9969dea052b02ba868bdf39e676349f10dce Mon Sep 17 00:00:00 2001<br> -From: Jussi Kukkonen <<a 
href="mailto:jussi.kukkonen@intel.com" target="_blank">jussi.kukkonen@intel.com</a>><br> -Date: Thu, 9 Feb 2017 14:51:28 +0200<br> -Subject: [PATCH] curl: allow overriding default CA certificate file<br> -<br> -Similar to curl, --cacert can now be used in cve-check-tool and<br> -cve-check-update to override the default CA certificate file. Useful<br> -in cases where the system default is unsuitable (for example,<br> -out-dated) or broken (as in OE's current native libcurl, which embeds<br> -a path string from one build host and then uses it on another although<br> -the right path may have become something different).<br> -<br> -Upstream-Status: Submitted [<a href="https://github.com/ikeydoherty/cve-check-tool/pull/45" rel="noreferrer" target="_blank">https://github.com/ikeydoherty/cve-check-tool/pull/45</a>]<br> -<br> -Signed-off-by: Patrick Ohly <<a href="mailto:patrick.ohly@intel.com" target="_blank">patrick.ohly@intel.com</a>><br> -<br> -<br> -Took Patrick Ohlys original patch from meta-security-isafw, rebased<br> -on top of other patches.<br> -<br> -Signed-off-by: Jussi Kukkonen <<a href="mailto:jussi.kukkonen@intel.com" target="_blank">jussi.kukkonen@intel.com</a>><br> ----<br> - src/library/cve-check-tool.h | 1 +<br> - src/library/fetch.c     | 10 +++++++++-<br> - src/library/fetch.h     | 3 ++-<br> - src/main.c          | 5 ++++-<br> - src/update-main.c      | 4 +++-<br> - src/update.c         | 12 +++++++-----<br> - src/update.h         | 2 +-<br> - 7 files changed, 27 insertions(+), 10 deletions(-)<br> -<br> -diff --git a/src/library/cve-check-tool.h b/src/library/cve-check-tool.h<br> -index e4bb5b1..f89eade 100644<br> ---- a/src/library/cve-check-tool.h<br> -+++ b/src/library/cve-check-tool.h<br> -@@ -43,6 +43,7 @@ typedef struct CveCheckTool {<br> -   bool bugs;             /**<Whether bug tracking is enabled */<br> -   GHashTable *mapping;        /**<CVE Mapping */<br> -   const char *output_file;      /**<Output file, if any */<br> -+  const char *cacert_file;      /**<Non-default SSL certificate file, if any */<br> - } CveCheckTool;<br> - <br> - /**<br> -diff --git a/src/library/fetch.c b/src/library/fetch.c<br> -index 0fe6d76..8f998c3 100644<br> ---- a/src/library/fetch.c<br> -+++ b/src/library/fetch.c<br> -@@ -60,7 +60,8 @@ static int progress_callback_new(void *ptr, curl_off_t dltotal, curl_off_t dlnow<br> - }<br> - <br> - FetchStatus fetch_uri(const char *uri, const char *target, bool verbose,<br> --           unsigned int start_percent, unsigned int end_percent)<br> -+           unsigned int start_percent, unsigned int end_percent,<br> -+           const char *cacert_file)<br> - {<br> -     FetchStatus ret = FETCH_STATUS_FAIL;<br> -     CURLcode res;<br> -@@ -74,6 +75,13 @@ FetchStatus fetch_uri(const char *uri, const char *target, bool verbose,<br> -         return ret;<br> -     }<br> - <br> -+    if (cacert_file) {<br> -+        res = curl_easy_setopt(curl, CURLOPT_CAINFO, cacert_file);<br> -+        if (res != CURLE_OK) {<br> -+            goto bail;<br> -+        }<br> -+    }<br> -+<br> -     if (stat(target, &st) == 0) {<br> -         res = curl_easy_setopt(curl, CURLOPT_TIMECONDITION, CURL_TIMECOND_IFMODSINCE);<br> -         if (res != CURLE_OK) {<br> -diff --git a/src/library/fetch.h b/src/library/fetch.h<br> -index 4cce5d1..836c7d7 100644<br> ---- a/src/library/fetch.h<br> -+++ b/src/library/fetch.h<br> -@@ -29,7 +29,8 @@ typedef enum {<br> - * @return A FetchStatus, indicating the operation taken<br> - */<br> - FetchStatus fetch_uri(const char *uri, const char 
*target, bool verbose,<br> --           unsigned int this_percent, unsigned int next_percent);<br> -+           unsigned int this_percent, unsigned int next_percent,<br> -+           const char *cacert_file);<br> - <br> - /**<br> - * Attempt to extract the given gzipped file<br> -diff --git a/src/main.c b/src/main.c<br> -index 8e6f158..ae69d47 100644<br> ---- a/src/main.c<br> -+++ b/src/main.c<br> -@@ -280,6 +280,7 @@ static bool csv_mode = false;<br> - static char *modified_stamp = NULL;<br> - static gchar *mapping_file = NULL;<br> - static gchar *output_file = NULL;<br> -+static gchar *cacert_file = NULL;<br> - <br> - static GOptionEntry _entries[] = {<br> -     { "not-patched", 'n', 0, G_OPTION_ARG_NONE, &hide_patched, "Hide patched/addressed CVEs", NULL },<br> -@@ -294,6 +295,7 @@ static GOptionEntry _entries[] = {<br> -     { "csv", 'c', 0, G_OPTION_ARG_NONE, &csv_mode, "Output CSV formatted data only", NULL },<br> -     { "mapping", 'M', 0, G_OPTION_ARG_STRING, &mapping_file, "Path to a mapping file", NULL},<br> -     { "output-file", 'o', 0, G_OPTION_ARG_STRING, &output_file, "Path to the output file (output plugin specific)", NULL},<br> -+    { "cacert", 'C', 0, G_OPTION_ARG_STRING, &cacert_file, "Path to the combined SSL certificates file (system default is used if not set)", NULL},<br> -     { .short_name = 0 }<br> - };<br> - <br> -@@ -492,6 +494,7 @@ int main(int argc, char **argv)<br> - <br> -     quiet = csv_mode || !no_html;<br> -     self->output_file = output_file;<br> -+    self->cacert_file = cacert_file;<br> - <br> -     if (!csv_mode && self->output_file) {<br> -         quiet = false;<br> -@@ -530,7 +533,7 @@ int main(int argc, char **argv)<br> -         if (status) {<br> -             fprintf(stderr, "Update of db forced\n");<br> -             cve_db_unlock();<br> --            if (!update_db(quiet, db_path->str)) {<br> -+            if (!update_db(quiet, db_path->str, self->cacert_file)) {<br> -                 fprintf(stderr, "DB update failure\n");<br> -                 goto cleanup;<br> -             }<br> -diff --git a/src/update-main.c b/src/update-main.c<br> -index 2379cfa..c52d9d0 100644<br> ---- a/src/update-main.c<br> -+++ b/src/update-main.c<br> -@@ -43,11 +43,13 @@ the Free Software Foundation; either version 2 of the License, or\n\<br> - static gchar *nvds = NULL;<br> - static bool _show_version = false;<br> - static bool _quiet = false;<br> -+static const char *_cacert_file = NULL;<br> - <br> - static GOptionEntry _entries[] = {<br> -     { "nvd-dir", 'd', 0, G_OPTION_ARG_STRING, &nvds, "NVD directory in filesystem", NULL },<br> -     { "version", 'v', 0, G_OPTION_ARG_NONE, &_show_version, "Show version", NULL },<br> -     { "quiet", 'q', 0, G_OPTION_ARG_NONE, &_quiet, "Run silently", NULL },<br> -+    { "cacert", 'C', 0, G_OPTION_ARG_STRING, &_cacert_file, "Path to the combined SSL certificates file (system default is used if not set)", NULL},<br> -     { .short_name = 0 }<br> - };<br> - <br> -@@ -88,7 +90,7 @@ int main(int argc, char **argv)<br> -         goto end;<br> -     }<br> - <br> --    if (update_db(_quiet, db_path->str)) {<br> -+    if (update_db(_quiet, db_path->str, _cacert_file)) {<br> -         ret = EXIT_SUCCESS;<br> -     } else {<br> -         fprintf(stderr, "Failed to update database\n");<br> -diff --git a/src/update.c b/src/update.c<br> -index 070560a..8cb4a39 100644<br> ---- a/src/update.c<br> -+++ b/src/update.c<br> -@@ -267,7 +267,8 @@ static inline void update_end(int fd, const char *update_fname, bool ok)<br> - <br> - static int 
do_fetch_update(int year, const char *db_dir, CveDB *cve_db,<br> -              bool db_exist, bool verbose,<br> --              unsigned int this_percent, unsigned int next_percent)<br> -+              unsigned int this_percent, unsigned int next_percent,<br> -+              const char *cacert_file)<br> - {<br> -     const char nvd_uri[] = URI_PREFIX;<br> -     autofree(cve_string) *uri_meta = NULL;<br> -@@ -331,14 +332,14 @@ refetch:<br> -     }<br> - <br> -     /* Fetch NVD META file */<br> --    st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, this_percent, this_percent);<br> -+    st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, this_percent, this_percent, cacert_file);<br> -     if (st == FETCH_STATUS_FAIL) {<br> -         fprintf(stderr, "Failed to fetch %s\n", uri_meta->str);<br> -         return -1;<br> -     }<br> - <br> -     /* Fetch NVD XML file */<br> --    st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, this_percent, next_percent);<br> -+    st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, this_percent, next_percent, cacert_file);<br> -     switch (st) {<br> -     case FETCH_STATUS_FAIL:<br> -         fprintf(stderr, "Failed to fetch %s\n", uri_data_gz->str);<br> -@@ -391,7 +392,7 @@ refetch:<br> -     return 0;<br> - }<br> - <br> --bool update_db(bool quiet, const char *db_file)<br> -+bool update_db(bool quiet, const char *db_file, const char *cacert_file)<br> - {<br> -     autofree(char) *db_dir = NULL;<br> -     autofree(CveDB) *cve_db = NULL;<br> -@@ -466,7 +467,8 @@ bool update_db(bool quiet, const char *db_file)<br> -         if (!quiet)<br> -             fprintf(stderr, "completed: %u%%\r", start_percent);<br> -         rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet,<br> --                   start_percent, end_percent);<br> -+                   start_percent, end_percent,<br> -+                   cacert_file);<br> -         switch (rc) {<br> -         case 0:<br> -             if (!quiet)<br> -diff --git a/src/update.h b/src/update.h<br> -index b8e9911..ceea0c3 100644<br> ---- a/src/update.h<br> -+++ b/src/update.h<br> -@@ -15,7 +15,7 @@ cve_string *get_db_path(const char *path);<br> - <br> - int update_required(const char *db_file);<br> - <br> --bool update_db(bool quiet, const char *db_file);<br> -+bool update_db(bool quiet, const char *db_file, const char *cacert_file);<br> - <br> - <br> - /*<br> --- <br> -2.1.4<br> -<br> diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch b/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch<br> deleted file mode 100644<br> index 8ea6f686e3f..00000000000<br> --- a/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch<br> +++ /dev/null<br> @@ -1,135 +0,0 @@<br> -From e9ed26cde63f8ca7607a010a518329339f8c02d3 Mon Sep 17 00:00:00 2001<br> -From: =?UTF-8?q?Andr=C3=A9=20Draszik?= <<a href="mailto:git@andred.net" target="_blank">git@andred.net</a>><br> -Date: Mon, 26 Sep 2016 12:12:41 +0100<br> -Subject: [PATCH] print progress in percent when downloading CVE db<br> -MIME-Version: 1.0<br> -Content-Type: text/plain; charset=UTF-8<br> -Content-Transfer-Encoding: 8bit<br> -<br> -Upstream-Status: Pending<br> -Signed-off-by: André Draszik <<a href="mailto:git@andred.net" target="_blank">git@andred.net</a>><br> ----<br> - src/library/fetch.c | 28 +++++++++++++++++++++++++++-<br> - src/library/fetch.h | 3 ++-<br> - src/update.c 
   | 16 ++++++++++++----<br> - 3 files changed, 41 insertions(+), 6 deletions(-)<br> -<br> -diff --git a/src/library/fetch.c b/src/library/fetch.c<br> -index 06d4b30..0fe6d76 100644<br> ---- a/src/library/fetch.c<br> -+++ b/src/library/fetch.c<br> -@@ -37,13 +37,37 @@ static size_t write_func(void *ptr, size_t size, size_t nmemb, struct fetch_t *f<br> -     return fwrite(ptr, size, nmemb, f->f);<br> - }<br> - <br> --FetchStatus fetch_uri(const char *uri, const char *target, bool verbose)<br> -+struct percent_t {<br> -+    unsigned int start;<br> -+    unsigned int end;<br> -+};<br> -+<br> -+static int progress_callback_new(void *ptr, curl_off_t dltotal, curl_off_t dlnow, curl_off_t ultotal, curl_off_t ulnow)<br> -+{<br> -+    (void) ultotal;<br> -+    (void) ulnow;<br> -+<br> -+    struct percent_t *percent = (struct percent_t *) ptr;<br> -+<br> -+    if (dltotal && percent && percent->end >= percent->start) {<br> -+        unsigned int diff = percent->end - percent->start;<br> -+        if (diff) {<br> -+            fprintf(stderr,"completed: %"CURL_FORMAT_CURL_OFF_T"%%\r", percent->start + (diff * dlnow / dltotal));<br> -+        }<br> -+    }<br> -+<br> -+    return 0;<br> -+}<br> -+<br> -+FetchStatus fetch_uri(const char *uri, const char *target, bool verbose,<br> -+           unsigned int start_percent, unsigned int end_percent)<br> - {<br> -     FetchStatus ret = FETCH_STATUS_FAIL;<br> -     CURLcode res;<br> -     struct stat st;<br> -     CURL *curl = NULL;<br> -     struct fetch_t *f = NULL;<br> -+    struct percent_t percent = { .start = start_percent, .end = end_percent };<br> - <br> -     curl = curl_easy_init();<br> -     if (!curl) {<br> -@@ -67,6 +91,8 @@ FetchStatus fetch_uri(const char *uri, const char *target, bool verbose)<br> -     }<br> -     if (verbose) {<br> -         (void)curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 0L);<br> -+        (void)curl_easy_setopt(curl, CURLOPT_XFERINFODATA, &percent);<br> -+        (void)curl_easy_setopt(curl, CURLOPT_XFERINFOFUNCTION, progress_callback_new);<br> -     }<br> -     res = curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, (curl_write_callback)write_func);<br> -     if (res != CURLE_OK) {<br> -diff --git a/src/library/fetch.h b/src/library/fetch.h<br> -index 70c3779..4cce5d1 100644<br> ---- a/src/library/fetch.h<br> -+++ b/src/library/fetch.h<br> -@@ -28,7 +28,8 @@ typedef enum {<br> - * @param verbose Whether to be verbose<br> - * @return A FetchStatus, indicating the operation taken<br> - */<br> --FetchStatus fetch_uri(const char *uri, const char *target, bool verbose);<br> -+FetchStatus fetch_uri(const char *uri, const char *target, bool verbose,<br> -+           unsigned int this_percent, unsigned int next_percent);<br> - <br> - /**<br> - * Attempt to extract the given gzipped file<br> -diff --git a/src/update.c b/src/update.c<br> -index 30fbe96..eaeeefd 100644<br> ---- a/src/update.c<br> -+++ b/src/update.c<br> -@@ -266,7 +266,8 @@ static inline void update_end(int fd, const char *update_fname, bool ok)<br> - }<br> - <br> - static int do_fetch_update(int year, const char *db_dir, CveDB *cve_db,<br> --              bool db_exist, bool verbose)<br> -+              bool db_exist, bool verbose,<br> -+              unsigned int this_percent, unsigned int next_percent)<br> - {<br> -     const char nvd_uri[] = URI_PREFIX;<br> -     autofree(cve_string) *uri_meta = NULL;<br> -@@ -330,14 +331,14 @@ refetch:<br> -     }<br> - <br> -     /* Fetch NVD META file */<br> --    st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose);<br> 
-+    st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, this_percent, this_percent);<br> -     if (st == FETCH_STATUS_FAIL) {<br> -         fprintf(stderr, "Failed to fetch %s\n", uri_meta->str);<br> -         return -1;<br> -     }<br> - <br> -     /* Fetch NVD XML file */<br> --    st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose);<br> -+    st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, this_percent, next_percent);<br> -     switch (st) {<br> -     case FETCH_STATUS_FAIL:<br> -         fprintf(stderr, "Failed to fetch %s\n", uri_data_gz->str);<br> -@@ -459,10 +460,17 @@ bool update_db(bool quiet, const char *db_file)<br> -     for (int i = YEAR_START; i <= year+1; i++) {<br> -         int y = i > year ? -1 : i;<br> -         int rc;<br> -+        unsigned int start_percent = ((i+0 - YEAR_START) * 100) / (year+2 - YEAR_START);<br> -+        unsigned int end_percent = ((i+1 - YEAR_START) * 100) / (year+2 - YEAR_START);<br> - <br> --        rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet);<br> -+        if (!quiet)<br> -+            fprintf(stderr, "completed: %u%%\r", start_percent);<br> -+        rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet,<br> -+                   start_percent, end_percent);<br> -         switch (rc) {<br> -         case 0:<br> -+            if (!quiet)<br> -+                fprintf(stderr,"completed: %u%%\r", end_percent);<br> -             continue;<br> -         case ENOMEM:<br> -             goto oom;<br> --- <br> -2.9.3<br> -<br> diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch b/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch<br> deleted file mode 100644<br> index 458c0cc84e5..00000000000<br> --- a/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch<br> +++ /dev/null<br> @@ -1,52 +0,0 @@<br> -From b0426e63c9ac61657e029f689bcb8dd051e752c6 Mon Sep 17 00:00:00 2001<br> -From: Sergey Popovich <<a href="mailto:popovich_sergei@mail.ua" target="_blank">popovich_sergei@mail.ua</a>><br> -Date: Fri, 21 Apr 2017 07:32:23 -0700<br> -Subject: [PATCH] update: Compare computed vs expected sha256 digit string<br> - ignoring case<br> -<br> -We produce sha256 digest string using %x snprintf()<br> -qualifier for each byte of digest which uses alphabetic<br> -characters from "a" to "f" in lower case to represent<br> -integer values from 10 to 15.<br> -<br> -Previously all of the NVD META files supply sha256<br> -digest string for corresponding XML file in lower case.<br> -<br> -However due to some reason this changed recently to<br> -provide digest digits in upper case causing fetched<br> -data consistency checks to fail. 
This prevents database<br> -from being updated periodically.<br> -<br> -While commit c4f6e94 (update: Do not treat sha256 failure<br> -as fatal if requested) adds useful option to skip<br> -digest validation at all and thus provides workaround for<br> -this situation, it might be unacceptable for some<br> -deployments where we need to ensure that downloaded<br> -data is consistent before start parsing it and update<br> -SQLite database.<br> -<br> -Use strcasecmp() to compare two digest strings case<br> -insensitively and addressing this case.<br> -<br> -Upstream-Status: Backport<br> -Signed-off-by: Sergey Popovich <<a href="mailto:popovich_sergei@mail.ua" target="_blank">popovich_sergei@mail.ua</a>><br> ----<br> - src/update.c | 2 +-<br> - 1 file changed, 1 insertion(+), 1 deletion(-)<br> -<br> -diff --git a/src/update.c b/src/update.c<br> -index 8588f38..3cc6b67 100644<br> ---- a/src/update.c<br> -+++ b/src/update.c<br> -@@ -187,7 +187,7 @@ static bool nvdcve_data_ok(const char *meta, const char *data)<br> -         snprintf(&csum_data[idx], len, "%02hhx", digest[i]);<br> -     }<br> - <br> --    ret = streq(csum_meta, csum_data);<br> -+    ret = !strcasecmp(csum_meta, csum_data);<br> - <br> - err_unmap:<br> -     munmap(buffer, length);<br> --- <br> -2.11.0<br> -<br> diff --git a/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch b/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch<br> deleted file mode 100644<br> index 0774ad946a4..00000000000<br> --- a/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch<br> +++ /dev/null<br> @@ -1,51 +0,0 @@<br> -From ce64633b9733e962b8d8482244301f614d8b5845 Mon Sep 17 00:00:00 2001<br> -From: Khem Raj <<a href="mailto:raj.khem@gmail.com" target="_blank">raj.khem@gmail.com</a>><br> -Date: Mon, 22 Aug 2016 22:54:24 -0700<br> -Subject: [PATCH] Check for malloc_trim before using it<br> -<br> -malloc_trim is gnu specific and not all libc<br> -implement it, threfore write a configure check<br> -to poke for it first and use the define to<br> -guard its use.<br> -<br> -Helps in compiling on musl based systems<br> -<br> -Signed-off-by: Khem Raj <<a href="mailto:raj.khem@gmail.com" target="_blank">raj.khem@gmail.com</a>><br> ----<br> -Upstream-Status: Submitted [<a href="https://github.com/ikeydoherty/cve-check-tool/pull/48" rel="noreferrer" target="_blank">https://github.com/ikeydoherty/cve-check-tool/pull/48</a>]<br> - <a href="http://configure.ac" rel="noreferrer" target="_blank">configure.ac</a> | 2 ++<br> - src/core.c  | 4 ++--<br> - 2 files changed, 4 insertions(+), 2 deletions(-)<br> -<br> -diff --git a/<a href="http://configure.ac" rel="noreferrer" target="_blank">configure.ac</a> b/<a href="http://configure.ac" rel="noreferrer" target="_blank">configure.ac</a><br> -index d3b66ce..79c3542 100644<br> ---- a/<a href="http://configure.ac" rel="noreferrer" target="_blank">configure.ac</a><br> -+++ b/<a href="http://configure.ac" rel="noreferrer" target="_blank">configure.ac</a><br> -@@ -19,6 +19,8 @@ m4_define([json_required_version], [0.16.0])<br> - m4_define([openssl_required_version],[1.0.0])<br> - # TODO: Set minimum sqlite<br> - <br> -+AC_CHECK_FUNCS_ONCE(malloc_trim)<br> -+<br> - PKG_CHECK_MODULES(CVE_CHECK_TOOL,<br> -         [<br> -          glib-2.0 >= glib_required_version,<br> -diff --git a/src/core.c b/src/core.c<br> -index 6263031..0d5df29 100644<br> ---- a/src/core.c<br> -+++ b/src/core.c<br> -@@ -498,9 +498,9 @@ bool cve_db_load(CveDB 
*self, const char *fname)<br> -     }<br> - <br> -     b = true;<br> --<br> -+#ifdef HAVE_MALLOC_TRIM<br> -     malloc_trim(0);<br> --<br> -+#endif<br> -     xmlFreeTextReader(r);<br> -     if (fd) {<br> -         close(fd);<br> --- <br> -2.9.3<br> -<br> -- <br> 2.20.1<br> <br> -- <br> _______________________________________________<br> Openembedded-core mailing list<br> <a href="mailto:Openembedded-core@lists.openembedded.org" target="_blank">Openembedded-core@lists.openembedded.org</a><br> <a href="http://lists.openembedded.org/mailman/listinfo/openembedded-core" rel="noreferrer" target="_blank">http://lists.openembedded.org/mailman/listinfo/openembedded-core</a><br> </blockquote></div></div> -- _______________________________________________ Openembedded-core mailing list Openembedded-core@lists.openembedded.org http://lists.openembedded.org/mailman/listinfo/openembedded-core
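For anyone wanting to try the backported checker on their own builds, a
minimal way to enable it is sketched below (assumed, typical usage, not part
of the patch itself; the class name comes from the series above, the image
name is just an example):

  # Enable the class globally, e.g. from the build directory:
  echo 'INHERIT += "cve-check"' >> conf/local.conf

  # Any normal build will then also run the CVE scan, e.g.:
  bitbake core-image-minimal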
Hi,

On Wed, Nov 06, 2019 at 02:59:16PM +0000, Ryan Harkin wrote:
> Hi Ross/Richard,
>
> I'd like this applied to Sumo also. Should I create a new patch and send it
> to the list, or is there a process for requesting this is cherry-picked
> across?

I just posted the port of this and all other CVE scan related changes to sumo
http://lists.openembedded.org/pipermail/openembedded-core/2019-November/288817.html

But the question is valid :)

-Mikko
On Wed, 2019-11-06 at 16:06 +0000, Mikko.Rapeli@bmw.de wrote:
> Hi,
>
> On Wed, Nov 06, 2019 at 02:59:16PM +0000, Ryan Harkin wrote:
> > Hi Ross/Richard,
> >
> > I'd like this applied to Sumo also. Should I create a new patch and
> > send it to the list, or is there a process for requesting this is
> > cherry-picked across?
>
> I just posted the port of this and all other CVE scan related changes
> to sumo
> http://lists.openembedded.org/pipermail/openembedded-core/2019-November/288817.html
>
> But the question is valid :)

Support for sumo officially ended. I can see a case that the broken CVE
tools there are a good reason we could consider merging the patch series
but we do need to be able to test it to merge it to the main branch. If we
can't test, we're merging blind and the quality the project tries to
deliver could be compromised.

I have made some tweaks to the autobuilder which bring us closer to being
able to test sumo using the workers still around from that release.

The things that make me nervous are questions like:

Which releases do we "open" for such patches? How far back do we go?
Which kinds of patches are acceptable?

Note that sumo (and earlier) doesn't have much of the QA automation which
we've now built our processes around so we don't get test reports.

You mention wanting to change gcc. That means we really do need a full
retest of it to merge that (which is why it never happened originally from
what I remember).

Also, the LTS proposal stated we needed someone to handle this work. We
have no such person, even if we do somehow find them, they can't be
expected to cover all the old releases and effectively turn all of them
into LTS releases. How can we get the funding to try and get some help
with handling this workload?

I am probably going to try and make a case for sorting the CVE tooling on
sumo as I agree it's bad and we should do something. Where do we draw the
line though.

Basically, this looks like it could create a lot of extra work without
helping the core project under-resourcing we currently struggle with.
You can therefore see why I might be nervous :/.

Cheers,

Richard
On Thu, 7 Nov 2019 at 07:59, <Mikko.Rapeli@bmw.de> wrote:
> Hi,
>
> On Wed, Nov 06, 2019 at 05:53:27PM +0000, Richard Purdie wrote:
> > On Wed, 2019-11-06 at 16:06 +0000, Mikko.Rapeli@bmw.de wrote:
> > > Hi,
> > >
> > > On Wed, Nov 06, 2019 at 02:59:16PM +0000, Ryan Harkin wrote:
> > > > Hi Ross/Richard,
> > > >
> > > > I'd like this applied to Sumo also. Should I create a new patch and
> > > > send it to the list, or is there a process for requesting this is
> > > > cherry-picked across?
> > >
> > > I just posted the port of this and all other CVE scan related changes
> > > to sumo
> > > http://lists.openembedded.org/pipermail/openembedded-core/2019-November/288817.html
> > >

Thanks Mikko! That's a great help, I guess my question was good timing for
our mutual interest in Sumo.

> > > But the question is valid :)
> >
> > Support for sumo officially ended. I can see a case that the broken CVE
> > tools there are a good reason we could consider merging the patch
> > series but we do need to be able to test it to merge it to the main
> > branch. If we can't test, we're merging blind and the quality the
> > project tries to deliver could be compromised.
> >
> > I have made some tweaks to the autobuilder which bring us closer to
> > being able to test sumo using the workers still around from that
> > release.
> >
> > The things that make me nervous are questions like:
> >
> > Which releases do we "open" for such patches? How far back do we go?
> > Which kinds of patches are acceptable?
> >
> > Note that sumo (and earlier) doesn't have much of the QA automation
> > which we've now built our processes around so we don't get test
> > reports.
> >
> > You mention wanting to change gcc. That means we really do need a full
> > retest of it to merge that (which is why it never happened originally
> > from what I remember).
> >
> > Also, the LTS proposal stated we needed someone to handle this work. We
> > have no such person, even if we do somehow find them, they can't be
> > expected to cover all the old releases and effectively turn all of them
> > into LTS releases. How can we get the funding to try and get some help
> > with handling this workload?
> >
> > I am probably going to try and make a case for sorting the CVE tooling
> > on sumo as I agree it's bad and we should do something. Where do we draw
> > the line though.
> >
> > Basically, this looks like it could create a lot of extra work without
> > helping the core project under-resourcing we currently struggle with.
> > You can therefore see why I might be nervous :/.
>
> All this is understood.

Agreed. It's an expensive and tricky task.

> I need to maintain sumo in a project for a while longer so I can publish
> that work.
> The CVE checker patches are just a start.

Yes, the same is true for me. I need to maintain a Sumo distro until
mid-2020, at least. It uses the poky merge branch [1]. My support may
extend further when the time comes.

> Providing funding for Yocto Project LTS work is possible but a lot harder
> for me to do.
> Testing and publishing patches is much easier.

Not sure if it helps, but I have a Jenkins job that tests sumo on a trigger
(there is one for Warrior also):

https://ci.linaro.org/job/warp7-openembedded-sumo/

eg. it was triggered when Armin's patch was merged yesterday.

This builds Sumo, based on Linaro's OE-RPB distro for NXP WaRP7
(imx7s-warp). It then runs the build in our LAVA lab (although the boards
have gone down recently, they're normally up). Once the boards are up
again, I'll add ptest to the job, to give it a more thorough workout. I'll
also add the sumo-next branch to the list of build configurations.

> Could you clarify Yocto Project side answers to these questions:
>
> If I continue to publish patches for sumo, can I continue doing so on
> the oe-core mailing list?
>
> If I continue to collect patches for sumo, can I do so using Yocto Project
> infrastructure, e.g. a sumo-contrib-lts or similar branch in the poky git
> tree?
>
> If I continue to test patches, what would be the patch acceptance criteria
> and required testing? I would assume the same as the stable release rules,
> but maybe these need to be even stricter, e.g. only support building on
> Debian stable, following the LTS proposal. I'm testing in my own project
> trees and CI with target HW, and doing world builds on pure poky with a
> qemu target. I could add some kind of ptest execution to plain poky as
> well.
>
> Would any testing of patches be possible in Yocto Project infrastructure?
>
> All of these things I can do also completely outside of Yocto Project,
> e.g. publish a sumo git tree on github, and rely only on my own testing.
> But I'd like to see some co-operation here from other users who are stuck
> with sumo.
>
> -Mikko

[1] http://git.yoctoproject.org/git/poky
Hi,

On Thu, Nov 07, 2019 at 10:41:42AM +0000, Ryan Harkin wrote:
<snip>
> Not sure if it helps, but I have a Jenkins job that tests sumo on a trigger
> (there is one for Warrior also):
>
> https://ci.linaro.org/job/warp7-openembedded-sumo/
>
> eg. it was triggered when Armin's patch was merged yesterday.
>
> This builds Sumo, based on Linaro's OE-RPB distro for NXP WaRP7
> (imx7s-warp). It then runs the build in our LAVA lab (although the boards
> have gone down recently, they're normally up). Once the boards are up
> again, I'll add ptest to the job, to give it a more thorough workout. I'll
> also add the sumo-next branch to the list of build configurations.

Could it make sense to also test sumo-next using this job? Also, I would
not mind seeing test trigger and result emails on this list as long as
access to logs and details is open to everyone.

I checked the Jenkins build job config and it looks quite straightforward.
Did not see the actual target testing logs though.

Cheers,

-Mikko
Hi Mikko,

On Mon, 11 Nov 2019 at 08:08, <Mikko.Rapeli@bmw.de> wrote:
>
> Hi,
>
> On Thu, Nov 07, 2019 at 10:41:42AM +0000, Ryan Harkin wrote:
> <snip>
> > Not sure if it helps, but I have a Jenkins job that tests sumo on a
> > trigger (there is one for Warrior also):
> >
> > https://ci.linaro.org/job/warp7-openembedded-sumo/
> >
> > eg. it was triggered when Armin's patch was merged yesterday.
> >
> > This builds Sumo, based on Linaro's OE-RPB distro for NXP WaRP7
> > (imx7s-warp). It then runs the build in our LAVA lab (although the boards
> > have gone down recently, they're normally up). Once the boards are up
> > again, I'll add ptest to the job, to give it a more thorough workout. I'll
> > also add the sumo-next branch to the list of build configurations.
>
> Could it make sense to also test sumo-next using this job?

Once the boards are recovered in the lab, I'll create sumo-next and
warrior-next trigger jobs and add a few more tests.

> Also, I would not mind seeing test trigger and result emails on this list
> as long as access to logs and details is open to everyone.

I believe this is public. At least, I can see everything from a private
browser window where I'm not logged in. It's trivial for me to add this
mailing list to the job, although I wonder if it'll bounce due to the email
address not being subscribed? I don't have access to the inbox where the
LAVA emails are sent from (lava@validation.linaro.org). It may be possible
for the lab team to subscribe to the list, I'd have to raise a support
request.

> I checked the Jenkins build job config and it looks quite straightforward.
> Did not see the actual target testing logs though.

That's because the boards have gone "bad" in the lab. Builds 16 and 17 are
still queued, waiting for a "good" board to run on. I've raised a support
ticket to recover the boards.

Build 15 was the last Sumo job to run, and it's been purged from Jenkins
since it's over 30 days old. However, our SQUAD and LAVA instances keep
their data forever (at the moment). The jobs are still available, if you
know where to find them.

The binaries have not been purged yet (pinned-manifest.xml will tell you
what was built):
https://snapshots.linaro.org/openembedded/warp7/sumo/imx7s-warp/15/rpb/

SQUAD data for Sumo on WaRP7:
https://qa-reports.linaro.org/warp7/warp7-bsp/build/16a83e5/

SQUAD results for Sumo build 15 on WaRP7:
https://qa-reports.linaro.org/warp7/warp7-bsp/build/16a83e5/testrun/1937355/

LAVA job for Sumo build 15 on WaRP7:
https://validation.linaro.org/scheduler/job/1937355

Testing-wise, I'm only really running boot testing at the moment. The test
job makes sure the board boots to the command line. Then it attempts to run
"memtester", which isn't installed, so fails. I did that on purpose
originally, because I wanted to make sure my memtester test failed properly
when not installed. I should update that now I'm sure it works. Next it
runs badblocks across the rootfs partition (/dev/mmcblk1p2) in read-only
mode.

Regards,
Ryan.

>
> Cheers,
>
> -Mikko
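For reference, a rough sketch of the kind of on-target checks described
above (illustrative commands only, not the actual LAVA job definition; the
partition name comes from the message, while the memtester size and
iteration count are made up):

  # Read-only surface scan of the rootfs partition; badblocks runs in
  # read-only mode by default, -s shows progress, -v is verbose.
  badblocks -sv /dev/mmcblk1p2

  # Basic RAM sanity check; exits non-zero if memtester is missing or a
  # test fails.
  memtester 4M 1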
On Thu, 2019-11-07 at 07:59 +0000, Mikko.Rapeli@bmw.de wrote:
> I need to maintain sumo in a project for a while longer so I can
> publish that work. The CVE checker patches are just a start.
>
> Providing funding for Yocto Project LTS work is possible but a lot
> harder for me to do. Testing and publishing patches is much easier.
>
> Could you clarify Yocto Project side answers to these questions:

I just want to be clear I'm not ignoring this, but I need the TSC to figure a few things out before we can answer. I suspect that may take a short while, as it's tied into the discussions about LTS, and that rests with the YP governing board at the moment. We need to consistently handle everything with a plan, so patience with this is appreciated!

Cheers,

Richard
On 11/6/19 11:59 PM, Mikko.Rapeli@bmw.de wrote:
> Hi,
>
> On Wed, Nov 06, 2019 at 05:53:27PM +0000, Richard Purdie wrote:
>> On Wed, 2019-11-06 at 16:06 +0000, Mikko.Rapeli@bmw.de wrote:
>>> Hi,
>>>
>>> On Wed, Nov 06, 2019 at 02:59:16PM +0000, Ryan Harkin wrote:
>>>> Hi Ross/Richard,
>>>>
>>>> I'd like this applied to Sumo also. Should I create a new patch and send it
>>>> to the list, or is there a process for requesting this is cherry-picked
>>>> across?
>>> I just posted the port of this and all other CVE scan related changes
>>> to sumo
>>> http://lists.openembedded.org/pipermail/openembedded-core/2019-November/288817.html
>>>
>>> But the question is valid :)
>> Support for sumo officially ended. I can see a case that the broken CVE
>> tools there are a good reason we could consider merging the patch
>> series but we do need to be able to test it to merge it to the main
>> branch. If we can't test, we're merging blind and the quality the
>> project tries to deliver could be compromised.
>>
>> I have made some tweaks to the autobuilder which bring us closer to
>> being able to test sumo using the workers still around from that
>> release.
>>
>> The things that make me nervous are questions like:
>>
>> Which releases do we "open" for such patches? How far back do we go?
>> Which kinds of patches are acceptable?
>>
>> Note that sumo (and earlier) doesn't have much of the QA automation
>> which we've now built our processes around so we don't get test
>> reports.
>>
>> You mention wanting to change gcc. That means we really do need a full
>> retest of it to merge that (which is why it never happened originally
>> from what I remember).
>>
>> Also, the LTS proposal stated we needed someone to handle this work. We
>> have no such person, even if we do somehow find them, they can't be
>> expected to cover all the old releases and effectively turn all of them
>> into LTS releases. How can we get the funding to try and get some help
>> with handling this workload?
>>
>> I am probably going to try and make a case for sorting the CVE tooling
>> on sumo as I agree its bad and we should do something. Where do we draw
>> the line though.
>>
>> Basically, this looks like it could create a lot of extra work without
>> helping the core project under-resourcing we currently struggle with.
>> You can therefore see why I might be nervous :/.
> All this is understood.
>
> I need to maintain sumo in a project for a while longer so I can publish that work.
> The CVE checker patches are just a start.
>
> Providing funding for Yocto Project LTS work is possible but a lot harder for me to do.
> Testing and publishing patches is much easier.
>
> Could you clarify Yocto Project side answers to these questions:
>
> If I continue to publish patches for sumo, can I continue doing so on oe-core mailing list?

As far as I understand it, Sumo is under "Community supported", and now more and more patches are being sent. We should formalize this process, IMHO. I don't mind collecting them, but they won't land in mainline until we address the regressions for the other layers or change the policy.

>
> If I continue to collect patches for sumo, can I do so using Yocto Project infrastructure, e.g.
> a sumo-contrib-lts or similar branch in poky git tree?

Well, if you get write permission, then the stable branch maintainer should have it too. You can use "https://git.openembedded.org/openembedded-core-contrib/log/?h=stable/sumo-community". Would we want a similar scheme in Poky-contrib?

I would prefer patches being sent to the list before they land in the branch. If we decide to build, we can use those branches. Not sure where they would go from there.

>
> If I continue to test patches, what would be the patch acceptance criteria and required testing?
> I would assume same as stable release rules, but maybe these need to be even stricter, e.g.
> only support building on Debian stable, following the LTS proposal. I'm testing in my own project
> trees and CI with target HW, and doing world builds on pure poky with qemu target. I could some
> kind of ptest execution to plain poky as well.
>
> Would any testing of patches be possible in Yocto Project infrastructure?

How about BMW joining the Project? Cash might help support such an endeavor.

>
> All of these things I can do also completely outside of Yocto Project, e.g. publish a sumo
> git tree on github, and rely only on my own testing. But I'd like to see
> some co-operation here from other users who are stuck with sumo.

I would prefer not to see a fork situation except as a last resort. Let's see what we can come up with.

regards,
armin

>
> -Mikko
diff --git a/meta/classes/cve-check.bbclass b/meta/classes/cve-check.bbclass index 743bc08a4f9..c00d2910be1 100644 --- a/meta/classes/cve-check.bbclass +++ b/meta/classes/cve-check.bbclass @@ -26,7 +26,7 @@ CVE_PRODUCT ??= "${BPN}" CVE_VERSION ??= "${PV}" CVE_CHECK_DB_DIR ?= "${DL_DIR}/CVE_CHECK" -CVE_CHECK_DB_FILE ?= "${CVE_CHECK_DB_DIR}/nvd.db" +CVE_CHECK_DB_FILE ?= "${CVE_CHECK_DB_DIR}/nvdcve_1.0.db" CVE_CHECK_LOG ?= "${T}/cve.log" CVE_CHECK_TMP_FILE ?= "${TMPDIR}/cve_check" @@ -37,32 +37,33 @@ CVE_CHECK_COPY_FILES ??= "1" CVE_CHECK_CREATE_MANIFEST ??= "1" # Whitelist for packages (PN) -CVE_CHECK_PN_WHITELIST = "\ - glibc-locale \ -" +CVE_CHECK_PN_WHITELIST ?= "" -# Whitelist for CVE and version of package -CVE_CHECK_CVE_WHITELIST = "{\ - 'CVE-2014-2524': ('6.3','5.2',), \ -}" +# Whitelist for CVE. If a CVE is found, then it is considered patched. +# The value is a string containing space separated CVE values: +# +# CVE_CHECK_WHITELIST = 'CVE-2014-2524 CVE-2018-1234' +# +CVE_CHECK_WHITELIST ?= "" python do_cve_check () { """ Check recipe for patched and unpatched CVEs """ - if os.path.exists(d.getVar("CVE_CHECK_TMP_FILE")): + if os.path.exists(d.getVar("CVE_CHECK_DB_FILE")): patched_cves = get_patches_cves(d) patched, unpatched = check_cves(d, patched_cves) if patched or unpatched: cve_data = get_cve_info(d, patched + unpatched) cve_write_data(d, patched, unpatched, cve_data) else: - bb.note("Failed to update CVE database, skipping CVE check") + bb.note("No CVE database found, skipping CVE check") + } addtask cve_check after do_unpack before do_build -do_cve_check[depends] = "cve-check-tool-native:do_populate_sysroot cve-check-tool-native:do_populate_cve_db" +do_cve_check[depends] = "cve-update-db-native:do_populate_cve_db" do_cve_check[nostamp] = "1" python cve_check_cleanup () { @@ -163,65 +164,94 @@ def get_patches_cves(d): def check_cves(d, patched_cves): """ - Run cve-check-tool looking for patched and unpatched CVEs. + Connect to the NVD database and find unpatched cves. """ - import ast, csv, tempfile, subprocess, io + from distutils.version import LooseVersion - cves_patched = [] cves_unpatched = [] - bpn = d.getVar("CVE_PRODUCT") + # CVE_PRODUCT can contain more than one product (eg. 
curl/libcurl) + products = d.getVar("CVE_PRODUCT").split() # If this has been unset then we're not scanning for CVEs here (for example, image recipes) - if not bpn: + if not products: return ([], []) pv = d.getVar("CVE_VERSION").split("+git")[0] - cves = " ".join(patched_cves) - cve_db_dir = d.getVar("CVE_CHECK_DB_DIR") - cve_whitelist = ast.literal_eval(d.getVar("CVE_CHECK_CVE_WHITELIST")) - cve_cmd = "cve-check-tool" - cmd = [cve_cmd, "--no-html", "--skip-update", "--csv", "--not-affected", "-t", "faux", "-d", cve_db_dir] # If the recipe has been whitlisted we return empty lists if d.getVar("PN") in d.getVar("CVE_CHECK_PN_WHITELIST").split(): bb.note("Recipe has been whitelisted, skipping check") return ([], []) - try: - # Write the faux CSV file to be used with cve-check-tool - fd, faux = tempfile.mkstemp(prefix="cve-faux-") - with os.fdopen(fd, "w") as f: - for pn in bpn.split(): - f.write("%s,%s,%s,\n" % (pn, pv, cves)) - cmd.append(faux) - - output = subprocess.check_output(cmd).decode("utf-8") - bb.debug(2, "Output of command %s:\n%s" % ("\n".join(cmd), output)) - except subprocess.CalledProcessError as e: - bb.warn("Couldn't check for CVEs: %s (output %s)" % (e, e.output)) - finally: - os.remove(faux) - - for row in csv.reader(io.StringIO(output)): - # Third row has the unpatched CVEs - if row[2]: - for cve in row[2].split(): - # Skip if the CVE has been whitlisted for the current version - if pv in cve_whitelist.get(cve,[]): - bb.note("%s-%s has been whitelisted for %s" % (bpn, pv, cve)) + old_cve_whitelist = d.getVar("CVE_CHECK_CVE_WHITELIST") + if old_cve_whitelist: + bb.warn("CVE_CHECK_CVE_WHITELIST is deprecated, please use CVE_CHECK_WHITELIST.") + cve_whitelist = d.getVar("CVE_CHECK_WHITELIST").split() + + import sqlite3 + db_file = d.getVar("CVE_CHECK_DB_FILE") + conn = sqlite3.connect(db_file) + + for product in products: + c = conn.cursor() + if ":" in product: + vendor, product = product.split(":", 1) + c.execute("SELECT * FROM PRODUCTS WHERE PRODUCT IS ? 
AND VENDOR IS ?", (product, vendor)) + else: + c.execute("SELECT * FROM PRODUCTS WHERE PRODUCT IS ?", (product,)) + + for row in c: + cve = row[0] + version_start = row[3] + operator_start = row[4] + version_end = row[5] + operator_end = row[6] + + if cve in cve_whitelist: + bb.note("%s-%s has been whitelisted for %s" % (product, pv, cve)) + elif cve in patched_cves: + bb.note("%s has been patched" % (cve)) + else: + to_append = False + if (operator_start == '=' and pv == version_start): + cves_unpatched.append(cve) else: + if operator_start: + try: + to_append_start = (operator_start == '>=' and LooseVersion(pv) >= LooseVersion(version_start)) + to_append_start |= (operator_start == '>' and LooseVersion(pv) > LooseVersion(version_start)) + except: + bb.note("%s: Failed to compare %s %s %s for %s" % + (product, pv, operator_start, version_start, cve)) + to_append_start = False + else: + to_append_start = False + + if operator_end: + try: + to_append_end = (operator_end == '<=' and LooseVersion(pv) <= LooseVersion(version_end)) + to_append_end |= (operator_end == '<' and LooseVersion(pv) < LooseVersion(version_end)) + except: + bb.note("%s: Failed to compare %s %s %s for %s" % + (product, pv, operator_end, version_end, cve)) + to_append_end = False + else: + to_append_end = False + + if operator_start and operator_end: + to_append = to_append_start and to_append_end + else: + to_append = to_append_start or to_append_end + + if to_append: cves_unpatched.append(cve) - bb.debug(2, "%s-%s is not patched for %s" % (bpn, pv, cve)) - # Fourth row has patched CVEs - if row[3]: - for cve in row[3].split(): - cves_patched.append(cve) - bb.debug(2, "%s-%s is patched for %s" % (bpn, pv, cve)) + bb.debug(2, "%s-%s is not patched for %s" % (product, pv, cve)) + conn.close() - return (cves_patched, cves_unpatched) + return (list(patched_cves), cves_unpatched) def get_cve_info(d, cves): """ - Get CVE information from the database used by cve-check-tool. + Get CVE information from the database. Unfortunately the only way to get CVE info is set the output to html (hard to parse) or query directly the database. 
@@ -241,9 +271,10 @@ def get_cve_info(d, cves): for row in cur.execute(query, tuple(cves)): cve_data[row[0]] = {} cve_data[row[0]]["summary"] = row[1] - cve_data[row[0]]["score"] = row[2] - cve_data[row[0]]["modified"] = row[3] - cve_data[row[0]]["vector"] = row[4] + cve_data[row[0]]["scorev2"] = row[2] + cve_data[row[0]]["scorev3"] = row[3] + cve_data[row[0]]["modified"] = row[4] + cve_data[row[0]]["vector"] = row[5] conn.close() return cve_data @@ -270,7 +301,8 @@ def cve_write_data(d, patched, unpatched, cve_data): unpatched_cves.append(cve) write_string += "CVE STATUS: Unpatched\n" write_string += "CVE SUMMARY: %s\n" % cve_data[cve]["summary"] - write_string += "CVSS v2 BASE SCORE: %s\n" % cve_data[cve]["score"] + write_string += "CVSS v2 BASE SCORE: %s\n" % cve_data[cve]["scorev2"] + write_string += "CVSS v3 BASE SCORE: %s\n" % cve_data[cve]["scorev3"] write_string += "VECTOR: %s\n" % cve_data[cve]["vector"] write_string += "MORE INFORMATION: %s%s\n\n" % (nvd_link, cve) diff --git a/meta/conf/distro/include/maintainers.inc b/meta/conf/distro/include/maintainers.inc index 672f0677922..c027901fdf0 100644 --- a/meta/conf/distro/include/maintainers.inc +++ b/meta/conf/distro/include/maintainers.inc @@ -116,6 +116,7 @@ RECIPE_MAINTAINER_pn-cryptodev-tests = "Robert Yang <liezhi.yang@windriver.com>" RECIPE_MAINTAINER_pn-cups = "Chen Qi <Qi.Chen@windriver.com>" RECIPE_MAINTAINER_pn-curl = "Armin Kuster <akuster808@gmail.com>" RECIPE_MAINTAINER_pn-cve-check-tool = "Ross Burton <ross.burton@intel.com>" +RECIPE_MAINTAINER_pn-cve-update-db-native = "Ross Burton <ross.burton@intel.com>" RECIPE_MAINTAINER_pn-cwautomacros = "Ross Burton <ross.burton@intel.com>" RECIPE_MAINTAINER_pn-db = "Mark Hatle <mark.hatle@windriver.com>" RECIPE_MAINTAINER_pn-dbus = "Chen Qi <Qi.Chen@windriver.com>" diff --git a/meta/recipes-core/glibc/glibc-locale.inc b/meta/recipes-core/glibc/glibc-locale.inc index 1b676dc26e7..97d83cb856d 100644 --- a/meta/recipes-core/glibc/glibc-locale.inc +++ b/meta/recipes-core/glibc/glibc-locale.inc @@ -95,3 +95,6 @@ do_install () { inherit libc-package BBCLASSEXTEND = "nativesdk" + +# Don't scan for CVEs as glibc will be scanned +CVE_PRODUCT = "" diff --git a/meta/recipes-core/glibc/glibc-mtrace.inc b/meta/recipes-core/glibc/glibc-mtrace.inc index d703c14bdc1..ef9d60ec239 100644 --- a/meta/recipes-core/glibc/glibc-mtrace.inc +++ b/meta/recipes-core/glibc/glibc-mtrace.inc @@ -11,3 +11,6 @@ do_install() { install -d -m 0755 ${D}${bindir} install -m 0755 ${SRC}/mtrace ${D}${bindir}/ } + +# Don't scan for CVEs as glibc will be scanned +CVE_PRODUCT = "" diff --git a/meta/recipes-core/glibc/glibc-scripts.inc b/meta/recipes-core/glibc/glibc-scripts.inc index 2a2b41507ed..14a14e45126 100644 --- a/meta/recipes-core/glibc/glibc-scripts.inc +++ b/meta/recipes-core/glibc/glibc-scripts.inc @@ -18,3 +18,6 @@ do_install() { # sotruss script requires sotruss-lib.so (given by libsotruss package), # to produce trace of the library calls. 
RDEPENDS_${PN} += "libsotruss" + +# Don't scan for CVEs as glibc will be scanned +CVE_PRODUCT = "" diff --git a/meta/recipes-core/meta/cve-update-db-native.bb b/meta/recipes-core/meta/cve-update-db-native.bb new file mode 100644 index 00000000000..2c427a5884f --- /dev/null +++ b/meta/recipes-core/meta/cve-update-db-native.bb @@ -0,0 +1,195 @@ +SUMMARY = "Updates the NVD CVE database" +LICENSE = "MIT" + +INHIBIT_DEFAULT_DEPS = "1" + +inherit native + +deltask do_unpack +deltask do_patch +deltask do_configure +deltask do_compile +deltask do_install +deltask do_populate_sysroot + +python () { + if not d.getVar("CVE_CHECK_DB_FILE"): + raise bb.parse.SkipRecipe("Skip recipe when cve-check class is not loaded.") +} + +python do_populate_cve_db() { + """ + Update NVD database with json data feed + """ + + import sqlite3, urllib, urllib.parse, shutil, gzip + from datetime import date + + BASE_URL = "https://nvd.nist.gov/feeds/json/cve/1.0/nvdcve-1.0-" + YEAR_START = 2002 + + db_dir = os.path.join(d.getVar("DL_DIR"), 'CVE_CHECK') + db_file = os.path.join(db_dir, 'nvdcve_1.0.db') + json_tmpfile = os.path.join(db_dir, 'nvd.json.gz') + proxy = d.getVar("https_proxy") + + if proxy: + # instantiate an opener but do not install it as the global + # opener unless if we're really sure it's applicable for all + # urllib requests + proxy_handler = urllib.request.ProxyHandler({'https': proxy}) + proxy_opener = urllib.request.build_opener(proxy_handler) + else: + proxy_opener = None + + cve_f = open(os.path.join(d.getVar("TMPDIR"), 'cve_check'), 'a') + + if not os.path.isdir(db_dir): + os.mkdir(db_dir) + + # Connect to database + conn = sqlite3.connect(db_file) + c = conn.cursor() + + initialize_db(c) + + for year in range(YEAR_START, date.today().year + 1): + year_url = BASE_URL + str(year) + meta_url = year_url + ".meta" + json_url = year_url + ".json.gz" + + # Retrieve meta last modified date + + response = None + + if proxy_opener: + response = proxy_opener.open(meta_url) + else: + req = urllib.request.Request(meta_url) + response = urllib.request.urlopen(req) + + if response: + for l in response.read().decode("utf-8").splitlines(): + key, value = l.split(":", 1) + if key == "lastModifiedDate": + last_modified = value + break + else: + bb.warn("Cannot parse CVE metadata, update failed") + return + + # Compare with current db last modified date + c.execute("select DATE from META where YEAR = ?", (year,)) + meta = c.fetchone() + if not meta or meta[0] != last_modified: + # Clear products table entries corresponding to current year + c.execute("delete from PRODUCTS where ID like ?", ('CVE-%d%%' % year,)) + + # Update db with current year json file + try: + if proxy_opener: + response = proxy_opener.open(json_url) + else: + req = urllib.request.Request(json_url) + response = urllib.request.urlopen(req) + + if response: + update_db(c, gzip.decompress(response.read()).decode('utf-8')) + c.execute("insert or replace into META values (?, ?)", [year, last_modified]) + except urllib.error.URLError as e: + cve_f.write('Warning: CVE db update error, CVE data is outdated.\n\n') + bb.warn("Cannot parse CVE data (%s), update failed" % e.reason) + return + + # Update success, set the date to cve_check file. 
+ if year == date.today().year: + cve_f.write('CVE database update : %s\n\n' % date.today()) + + cve_f.close() + conn.commit() + conn.close() +} + +def initialize_db(c): + c.execute("CREATE TABLE IF NOT EXISTS META (YEAR INTEGER UNIQUE, DATE TEXT)") + c.execute("CREATE TABLE IF NOT EXISTS NVD (ID TEXT UNIQUE, SUMMARY TEXT, \ + SCOREV2 TEXT, SCOREV3 TEXT, MODIFIED INTEGER, VECTOR TEXT)") + c.execute("CREATE TABLE IF NOT EXISTS PRODUCTS (ID TEXT, \ + VENDOR TEXT, PRODUCT TEXT, VERSION_START TEXT, OPERATOR_START TEXT, \ + VERSION_END TEXT, OPERATOR_END TEXT)") + +def parse_node_and_insert(c, node, cveId): + # Parse children node if needed + for child in node.get('children', ()): + parse_node_and_insert(c, child, cveId) + + def cpe_generator(): + for cpe in node.get('cpe_match', ()): + if not cpe['vulnerable']: + return + cpe23 = cpe['cpe23Uri'].split(':') + vendor = cpe23[3] + product = cpe23[4] + version = cpe23[5] + + if version != '*': + # Version is defined, this is a '=' match + yield [cveId, vendor, product, version, '=', '', ''] + else: + # Parse start version, end version and operators + op_start = '' + op_end = '' + v_start = '' + v_end = '' + + if 'versionStartIncluding' in cpe: + op_start = '>=' + v_start = cpe['versionStartIncluding'] + + if 'versionStartExcluding' in cpe: + op_start = '>' + v_start = cpe['versionStartExcluding'] + + if 'versionEndIncluding' in cpe: + op_end = '<=' + v_end = cpe['versionEndIncluding'] + + if 'versionEndExcluding' in cpe: + op_end = '<' + v_end = cpe['versionEndExcluding'] + + yield [cveId, vendor, product, v_start, op_start, v_end, op_end] + + c.executemany("insert into PRODUCTS values (?, ?, ?, ?, ?, ?, ?)", cpe_generator()) + +def update_db(c, jsondata): + import json + root = json.loads(jsondata) + + for elt in root['CVE_Items']: + if not elt['impact']: + continue + + cveId = elt['cve']['CVE_data_meta']['ID'] + cveDesc = elt['cve']['description']['description_data'][0]['value'] + date = elt['lastModifiedDate'] + accessVector = elt['impact']['baseMetricV2']['cvssV2']['accessVector'] + cvssv2 = elt['impact']['baseMetricV2']['cvssV2']['baseScore'] + + try: + cvssv3 = elt['impact']['baseMetricV3']['cvssV3']['baseScore'] + except: + cvssv3 = 0.0 + + c.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?)", + [cveId, cveDesc, cvssv2, cvssv3, date, accessVector]) + + configurations = elt['configurations']['nodes'] + for config in configurations: + parse_node_and_insert(c, config, cveId) + + +addtask do_populate_cve_db before do_fetch +do_populate_cve_db[nostamp] = "1" + +EXCLUDE_FROM_WORLD = "1" diff --git a/meta/recipes-devtools/cve-check-tool/cve-check-tool_5.6.4.bb b/meta/recipes-devtools/cve-check-tool/cve-check-tool_5.6.4.bb deleted file mode 100644 index 1c84fb1cf2d..00000000000 --- a/meta/recipes-devtools/cve-check-tool/cve-check-tool_5.6.4.bb +++ /dev/null @@ -1,62 +0,0 @@ -SUMMARY = "cve-check-tool" -DESCRIPTION = "cve-check-tool is a tool for checking known (public) CVEs.\ -The tool will identify potentially vunlnerable software packages within Linux distributions through version matching." 
-HOMEPAGE = "https://github.com/ikeydoherty/cve-check-tool" -SECTION = "Development/Tools" -LICENSE = "GPL-2.0+" -LIC_FILES_CHKSUM = "file://LICENSE;md5=e8c1458438ead3c34974bc0be3a03ed6" - -SRC_URI = "https://github.com/ikeydoherty/${BPN}/releases/download/v${PV}/${BP}.tar.xz \ - file://check-for-malloc_trim-before-using-it.patch \ - file://0001-print-progress-in-percent-when-downloading-CVE-db.patch \ - file://0001-curl-allow-overriding-default-CA-certificate-file.patch \ - file://0001-update-Compare-computed-vs-expected-sha256-digit-str.patch \ - file://0001-Fix-freeing-memory-allocated-by-sqlite.patch \ - " - -SRC_URI[md5sum] = "c5f4247140fc9be3bf41491d31a34155" -SRC_URI[sha256sum] = "b8f283be718af8d31232ac1bfc10a0378fb958aaaa49af39168f8acf501e6a5b" - -UPSTREAM_CHECK_URI = "https://github.com/ikeydoherty/cve-check-tool/releases" - -DEPENDS = "libcheck glib-2.0 json-glib curl libxml2 sqlite3 openssl ca-certificates" - -RDEPENDS_${PN} = "ca-certificates" - -inherit pkgconfig autotools - -EXTRA_OECONF = "--disable-coverage --enable-relative-plugins" -CFLAGS_append = " -Wno-error=pedantic" - -do_populate_cve_db() { - if [ "${BB_NO_NETWORK}" = "1" ] ; then - bbwarn "BB_NO_NETWORK is set; Can't update cve-check-tool database, new CVEs won't be detected" - return - fi - - # In case we don't inherit cve-check class, use default values defined in the class. - cve_dir="${CVE_CHECK_DB_DIR}" - cve_file="${CVE_CHECK_TMP_FILE}" - - [ -z "${cve_dir}" ] && cve_dir="${DL_DIR}/CVE_CHECK" - [ -z "${cve_file}" ] && cve_file="${TMPDIR}/cve_check" - - unused="${@bb.utils.export_proxies(d)}" - bbdebug 2 "Updating cve-check-tool database located in $cve_dir" - # --cacert works around curl-native not finding the CA bundle - if cve-check-update --cacert ${sysconfdir}/ssl/certs/ca-certificates.crt -d "$cve_dir" ; then - printf "CVE database was updated on %s UTC\n\n" "$(LANG=C date --utc +'%F %T')" > "$cve_file" - else - bbwarn "Error in executing cve-check-update" - if [ "${@'1' if bb.data.inherits_class('cve-check', d) else '0'}" -ne 0 ] ; then - bbwarn "Failed to update cve-check-tool database, CVEs won't be checked" - fi - fi -} - -addtask populate_cve_db after do_populate_sysroot -do_populate_cve_db[depends] = "cve-check-tool-native:do_populate_sysroot" -do_populate_cve_db[nostamp] = "1" -do_populate_cve_db[progress] = "percent" - -BBCLASSEXTEND = "native nativesdk" diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch b/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch deleted file mode 100644 index 4a82cf2dded..00000000000 --- a/meta/recipes-devtools/cve-check-tool/files/0001-Fix-freeing-memory-allocated-by-sqlite.patch +++ /dev/null @@ -1,50 +0,0 @@ -From a3353429652f83bb8b0316500faa88fa2555542d Mon Sep 17 00:00:00 2001 -From: Peter Marko <peter.marko@siemens.com> -Date: Thu, 13 Apr 2017 23:09:52 +0200 -Subject: [PATCH] Fix freeing memory allocated by sqlite - -Upstream-Status: Backport -Signed-off-by: Peter Marko <peter.marko@siemens.com> ---- - src/core.c | 8 ++++---- - 1 file changed, 4 insertions(+), 4 deletions(-) - -diff --git a/src/core.c b/src/core.c -index 6263031..6788f16 100644 ---- a/src/core.c -+++ b/src/core.c -@@ -82,7 +82,7 @@ static bool ensure_table(CveDB *self) - rc = sqlite3_exec(self->db, query, NULL, NULL, &err); - if (rc != SQLITE_OK) { - fprintf(stderr, "ensure_table(): %s\n", err); -- free(err); -+ sqlite3_free(err); - return false; - } - -@@ -91,7 +91,7 @@ static bool ensure_table(CveDB 
*self) - rc = sqlite3_exec(self->db, query, NULL, NULL, &err); - if (rc != SQLITE_OK) { - fprintf(stderr, "ensure_table(): %s\n", err); -- free(err); -+ sqlite3_free(err); - return false; - } - -@@ -99,11 +99,11 @@ static bool ensure_table(CveDB *self) - rc = sqlite3_exec(self->db, query, NULL, NULL, &err); - if (rc != SQLITE_OK) { - fprintf(stderr, "ensure_table(): %s\n", err); -- free(err); -+ sqlite3_free(err); - return false; - } - if (err) { -- free(err); -+ sqlite3_free(err); - } - - return true; --- -2.1.4 - diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch b/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch deleted file mode 100644 index 3d8ebd1bd26..00000000000 --- a/meta/recipes-devtools/cve-check-tool/files/0001-curl-allow-overriding-default-CA-certificate-file.patch +++ /dev/null @@ -1,215 +0,0 @@ -From 825a9969dea052b02ba868bdf39e676349f10dce Mon Sep 17 00:00:00 2001 -From: Jussi Kukkonen <jussi.kukkonen@intel.com> -Date: Thu, 9 Feb 2017 14:51:28 +0200 -Subject: [PATCH] curl: allow overriding default CA certificate file - -Similar to curl, --cacert can now be used in cve-check-tool and -cve-check-update to override the default CA certificate file. Useful -in cases where the system default is unsuitable (for example, -out-dated) or broken (as in OE's current native libcurl, which embeds -a path string from one build host and then uses it on another although -the right path may have become something different). - -Upstream-Status: Submitted [https://github.com/ikeydoherty/cve-check-tool/pull/45] - -Signed-off-by: Patrick Ohly <patrick.ohly@intel.com> - - -Took Patrick Ohlys original patch from meta-security-isafw, rebased -on top of other patches. 
- -Signed-off-by: Jussi Kukkonen <jussi.kukkonen@intel.com> ---- - src/library/cve-check-tool.h | 1 + - src/library/fetch.c | 10 +++++++++- - src/library/fetch.h | 3 ++- - src/main.c | 5 ++++- - src/update-main.c | 4 +++- - src/update.c | 12 +++++++----- - src/update.h | 2 +- - 7 files changed, 27 insertions(+), 10 deletions(-) - -diff --git a/src/library/cve-check-tool.h b/src/library/cve-check-tool.h -index e4bb5b1..f89eade 100644 ---- a/src/library/cve-check-tool.h -+++ b/src/library/cve-check-tool.h -@@ -43,6 +43,7 @@ typedef struct CveCheckTool { - bool bugs; /**<Whether bug tracking is enabled */ - GHashTable *mapping; /**<CVE Mapping */ - const char *output_file; /**<Output file, if any */ -+ const char *cacert_file; /**<Non-default SSL certificate file, if any */ - } CveCheckTool; - - /** -diff --git a/src/library/fetch.c b/src/library/fetch.c -index 0fe6d76..8f998c3 100644 ---- a/src/library/fetch.c -+++ b/src/library/fetch.c -@@ -60,7 +60,8 @@ static int progress_callback_new(void *ptr, curl_off_t dltotal, curl_off_t dlnow - } - - FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, -- unsigned int start_percent, unsigned int end_percent) -+ unsigned int start_percent, unsigned int end_percent, -+ const char *cacert_file) - { - FetchStatus ret = FETCH_STATUS_FAIL; - CURLcode res; -@@ -74,6 +75,13 @@ FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, - return ret; - } - -+ if (cacert_file) { -+ res = curl_easy_setopt(curl, CURLOPT_CAINFO, cacert_file); -+ if (res != CURLE_OK) { -+ goto bail; -+ } -+ } -+ - if (stat(target, &st) == 0) { - res = curl_easy_setopt(curl, CURLOPT_TIMECONDITION, CURL_TIMECOND_IFMODSINCE); - if (res != CURLE_OK) { -diff --git a/src/library/fetch.h b/src/library/fetch.h -index 4cce5d1..836c7d7 100644 ---- a/src/library/fetch.h -+++ b/src/library/fetch.h -@@ -29,7 +29,8 @@ typedef enum { - * @return A FetchStatus, indicating the operation taken - */ - FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, -- unsigned int this_percent, unsigned int next_percent); -+ unsigned int this_percent, unsigned int next_percent, -+ const char *cacert_file); - - /** - * Attempt to extract the given gzipped file -diff --git a/src/main.c b/src/main.c -index 8e6f158..ae69d47 100644 ---- a/src/main.c -+++ b/src/main.c -@@ -280,6 +280,7 @@ static bool csv_mode = false; - static char *modified_stamp = NULL; - static gchar *mapping_file = NULL; - static gchar *output_file = NULL; -+static gchar *cacert_file = NULL; - - static GOptionEntry _entries[] = { - { "not-patched", 'n', 0, G_OPTION_ARG_NONE, &hide_patched, "Hide patched/addressed CVEs", NULL }, -@@ -294,6 +295,7 @@ static GOptionEntry _entries[] = { - { "csv", 'c', 0, G_OPTION_ARG_NONE, &csv_mode, "Output CSV formatted data only", NULL }, - { "mapping", 'M', 0, G_OPTION_ARG_STRING, &mapping_file, "Path to a mapping file", NULL}, - { "output-file", 'o', 0, G_OPTION_ARG_STRING, &output_file, "Path to the output file (output plugin specific)", NULL}, -+ { "cacert", 'C', 0, G_OPTION_ARG_STRING, &cacert_file, "Path to the combined SSL certificates file (system default is used if not set)", NULL}, - { .short_name = 0 } - }; - -@@ -492,6 +494,7 @@ int main(int argc, char **argv) - - quiet = csv_mode || !no_html; - self->output_file = output_file; -+ self->cacert_file = cacert_file; - - if (!csv_mode && self->output_file) { - quiet = false; -@@ -530,7 +533,7 @@ int main(int argc, char **argv) - if (status) { - fprintf(stderr, "Update of db forced\n"); - 
cve_db_unlock(); -- if (!update_db(quiet, db_path->str)) { -+ if (!update_db(quiet, db_path->str, self->cacert_file)) { - fprintf(stderr, "DB update failure\n"); - goto cleanup; - } -diff --git a/src/update-main.c b/src/update-main.c -index 2379cfa..c52d9d0 100644 ---- a/src/update-main.c -+++ b/src/update-main.c -@@ -43,11 +43,13 @@ the Free Software Foundation; either version 2 of the License, or\n\ - static gchar *nvds = NULL; - static bool _show_version = false; - static bool _quiet = false; -+static const char *_cacert_file = NULL; - - static GOptionEntry _entries[] = { - { "nvd-dir", 'd', 0, G_OPTION_ARG_STRING, &nvds, "NVD directory in filesystem", NULL }, - { "version", 'v', 0, G_OPTION_ARG_NONE, &_show_version, "Show version", NULL }, - { "quiet", 'q', 0, G_OPTION_ARG_NONE, &_quiet, "Run silently", NULL }, -+ { "cacert", 'C', 0, G_OPTION_ARG_STRING, &_cacert_file, "Path to the combined SSL certificates file (system default is used if not set)", NULL}, - { .short_name = 0 } - }; - -@@ -88,7 +90,7 @@ int main(int argc, char **argv) - goto end; - } - -- if (update_db(_quiet, db_path->str)) { -+ if (update_db(_quiet, db_path->str, _cacert_file)) { - ret = EXIT_SUCCESS; - } else { - fprintf(stderr, "Failed to update database\n"); -diff --git a/src/update.c b/src/update.c -index 070560a..8cb4a39 100644 ---- a/src/update.c -+++ b/src/update.c -@@ -267,7 +267,8 @@ static inline void update_end(int fd, const char *update_fname, bool ok) - - static int do_fetch_update(int year, const char *db_dir, CveDB *cve_db, - bool db_exist, bool verbose, -- unsigned int this_percent, unsigned int next_percent) -+ unsigned int this_percent, unsigned int next_percent, -+ const char *cacert_file) - { - const char nvd_uri[] = URI_PREFIX; - autofree(cve_string) *uri_meta = NULL; -@@ -331,14 +332,14 @@ refetch: - } - - /* Fetch NVD META file */ -- st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, this_percent, this_percent); -+ st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, this_percent, this_percent, cacert_file); - if (st == FETCH_STATUS_FAIL) { - fprintf(stderr, "Failed to fetch %s\n", uri_meta->str); - return -1; - } - - /* Fetch NVD XML file */ -- st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, this_percent, next_percent); -+ st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, this_percent, next_percent, cacert_file); - switch (st) { - case FETCH_STATUS_FAIL: - fprintf(stderr, "Failed to fetch %s\n", uri_data_gz->str); -@@ -391,7 +392,7 @@ refetch: - return 0; - } - --bool update_db(bool quiet, const char *db_file) -+bool update_db(bool quiet, const char *db_file, const char *cacert_file) - { - autofree(char) *db_dir = NULL; - autofree(CveDB) *cve_db = NULL; -@@ -466,7 +467,8 @@ bool update_db(bool quiet, const char *db_file) - if (!quiet) - fprintf(stderr, "completed: %u%%\r", start_percent); - rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet, -- start_percent, end_percent); -+ start_percent, end_percent, -+ cacert_file); - switch (rc) { - case 0: - if (!quiet) -diff --git a/src/update.h b/src/update.h -index b8e9911..ceea0c3 100644 ---- a/src/update.h -+++ b/src/update.h -@@ -15,7 +15,7 @@ cve_string *get_db_path(const char *path); - - int update_required(const char *db_file); - --bool update_db(bool quiet, const char *db_file); -+bool update_db(bool quiet, const char *db_file, const char *cacert_file); - - - /* --- -2.1.4 - diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch 
b/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch deleted file mode 100644 index 8ea6f686e3f..00000000000 --- a/meta/recipes-devtools/cve-check-tool/files/0001-print-progress-in-percent-when-downloading-CVE-db.patch +++ /dev/null @@ -1,135 +0,0 @@ -From e9ed26cde63f8ca7607a010a518329339f8c02d3 Mon Sep 17 00:00:00 2001 -From: =?UTF-8?q?Andr=C3=A9=20Draszik?= <git@andred.net> -Date: Mon, 26 Sep 2016 12:12:41 +0100 -Subject: [PATCH] print progress in percent when downloading CVE db -MIME-Version: 1.0 -Content-Type: text/plain; charset=UTF-8 -Content-Transfer-Encoding: 8bit - -Upstream-Status: Pending -Signed-off-by: André Draszik <git@andred.net> ---- - src/library/fetch.c | 28 +++++++++++++++++++++++++++- - src/library/fetch.h | 3 ++- - src/update.c | 16 ++++++++++++---- - 3 files changed, 41 insertions(+), 6 deletions(-) - -diff --git a/src/library/fetch.c b/src/library/fetch.c -index 06d4b30..0fe6d76 100644 ---- a/src/library/fetch.c -+++ b/src/library/fetch.c -@@ -37,13 +37,37 @@ static size_t write_func(void *ptr, size_t size, size_t nmemb, struct fetch_t *f - return fwrite(ptr, size, nmemb, f->f); - } - --FetchStatus fetch_uri(const char *uri, const char *target, bool verbose) -+struct percent_t { -+ unsigned int start; -+ unsigned int end; -+}; -+ -+static int progress_callback_new(void *ptr, curl_off_t dltotal, curl_off_t dlnow, curl_off_t ultotal, curl_off_t ulnow) -+{ -+ (void) ultotal; -+ (void) ulnow; -+ -+ struct percent_t *percent = (struct percent_t *) ptr; -+ -+ if (dltotal && percent && percent->end >= percent->start) { -+ unsigned int diff = percent->end - percent->start; -+ if (diff) { -+ fprintf(stderr,"completed: %"CURL_FORMAT_CURL_OFF_T"%%\r", percent->start + (diff * dlnow / dltotal)); -+ } -+ } -+ -+ return 0; -+} -+ -+FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, -+ unsigned int start_percent, unsigned int end_percent) - { - FetchStatus ret = FETCH_STATUS_FAIL; - CURLcode res; - struct stat st; - CURL *curl = NULL; - struct fetch_t *f = NULL; -+ struct percent_t percent = { .start = start_percent, .end = end_percent }; - - curl = curl_easy_init(); - if (!curl) { -@@ -67,6 +91,8 @@ FetchStatus fetch_uri(const char *uri, const char *target, bool verbose) - } - if (verbose) { - (void)curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 0L); -+ (void)curl_easy_setopt(curl, CURLOPT_XFERINFODATA, &percent); -+ (void)curl_easy_setopt(curl, CURLOPT_XFERINFOFUNCTION, progress_callback_new); - } - res = curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, (curl_write_callback)write_func); - if (res != CURLE_OK) { -diff --git a/src/library/fetch.h b/src/library/fetch.h -index 70c3779..4cce5d1 100644 ---- a/src/library/fetch.h -+++ b/src/library/fetch.h -@@ -28,7 +28,8 @@ typedef enum { - * @param verbose Whether to be verbose - * @return A FetchStatus, indicating the operation taken - */ --FetchStatus fetch_uri(const char *uri, const char *target, bool verbose); -+FetchStatus fetch_uri(const char *uri, const char *target, bool verbose, -+ unsigned int this_percent, unsigned int next_percent); - - /** - * Attempt to extract the given gzipped file -diff --git a/src/update.c b/src/update.c -index 30fbe96..eaeeefd 100644 ---- a/src/update.c -+++ b/src/update.c -@@ -266,7 +266,8 @@ static inline void update_end(int fd, const char *update_fname, bool ok) - } - - static int do_fetch_update(int year, const char *db_dir, CveDB *cve_db, -- bool db_exist, bool verbose) -+ bool db_exist, bool verbose, -+ unsigned int 
this_percent, unsigned int next_percent) - { - const char nvd_uri[] = URI_PREFIX; - autofree(cve_string) *uri_meta = NULL; -@@ -330,14 +331,14 @@ refetch: - } - - /* Fetch NVD META file */ -- st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose); -+ st = fetch_uri(uri_meta->str, nvdcve_meta->str, verbose, this_percent, this_percent); - if (st == FETCH_STATUS_FAIL) { - fprintf(stderr, "Failed to fetch %s\n", uri_meta->str); - return -1; - } - - /* Fetch NVD XML file */ -- st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose); -+ st = fetch_uri(uri_data_gz->str, nvdcve_data_gz->str, verbose, this_percent, next_percent); - switch (st) { - case FETCH_STATUS_FAIL: - fprintf(stderr, "Failed to fetch %s\n", uri_data_gz->str); -@@ -459,10 +460,17 @@ bool update_db(bool quiet, const char *db_file) - for (int i = YEAR_START; i <= year+1; i++) { - int y = i > year ? -1 : i; - int rc; -+ unsigned int start_percent = ((i+0 - YEAR_START) * 100) / (year+2 - YEAR_START); -+ unsigned int end_percent = ((i+1 - YEAR_START) * 100) / (year+2 - YEAR_START); - -- rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet); -+ if (!quiet) -+ fprintf(stderr, "completed: %u%%\r", start_percent); -+ rc = do_fetch_update(y, db_dir, cve_db, db_exist, !quiet, -+ start_percent, end_percent); - switch (rc) { - case 0: -+ if (!quiet) -+ fprintf(stderr,"completed: %u%%\r", end_percent); - continue; - case ENOMEM: - goto oom; --- -2.9.3 - diff --git a/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch b/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch deleted file mode 100644 index 458c0cc84e5..00000000000 --- a/meta/recipes-devtools/cve-check-tool/files/0001-update-Compare-computed-vs-expected-sha256-digit-str.patch +++ /dev/null @@ -1,52 +0,0 @@ -From b0426e63c9ac61657e029f689bcb8dd051e752c6 Mon Sep 17 00:00:00 2001 -From: Sergey Popovich <popovich_sergei@mail.ua> -Date: Fri, 21 Apr 2017 07:32:23 -0700 -Subject: [PATCH] update: Compare computed vs expected sha256 digit string - ignoring case - -We produce sha256 digest string using %x snprintf() -qualifier for each byte of digest which uses alphabetic -characters from "a" to "f" in lower case to represent -integer values from 10 to 15. - -Previously all of the NVD META files supply sha256 -digest string for corresponding XML file in lower case. - -However due to some reason this changed recently to -provide digest digits in upper case causing fetched -data consistency checks to fail. This prevents database -from being updated periodically. - -While commit c4f6e94 (update: Do not treat sha256 failure -as fatal if requested) adds useful option to skip -digest validation at all and thus provides workaround for -this situation, it might be unacceptable for some -deployments where we need to ensure that downloaded -data is consistent before start parsing it and update -SQLite database. - -Use strcasecmp() to compare two digest strings case -insensitively and addressing this case. 
- -Upstream-Status: Backport -Signed-off-by: Sergey Popovich <popovich_sergei@mail.ua> ---- - src/update.c | 2 +- - 1 file changed, 1 insertion(+), 1 deletion(-) - -diff --git a/src/update.c b/src/update.c -index 8588f38..3cc6b67 100644 ---- a/src/update.c -+++ b/src/update.c -@@ -187,7 +187,7 @@ static bool nvdcve_data_ok(const char *meta, const char *data) - snprintf(&csum_data[idx], len, "%02hhx", digest[i]); - } - -- ret = streq(csum_meta, csum_data); -+ ret = !strcasecmp(csum_meta, csum_data); - - err_unmap: - munmap(buffer, length); --- -2.11.0 - diff --git a/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch b/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch deleted file mode 100644 index 0774ad946a4..00000000000 --- a/meta/recipes-devtools/cve-check-tool/files/check-for-malloc_trim-before-using-it.patch +++ /dev/null @@ -1,51 +0,0 @@ -From ce64633b9733e962b8d8482244301f614d8b5845 Mon Sep 17 00:00:00 2001 -From: Khem Raj <raj.khem@gmail.com> -Date: Mon, 22 Aug 2016 22:54:24 -0700 -Subject: [PATCH] Check for malloc_trim before using it - -malloc_trim is gnu specific and not all libc -implement it, threfore write a configure check -to poke for it first and use the define to -guard its use. - -Helps in compiling on musl based systems - -Signed-off-by: Khem Raj <raj.khem@gmail.com> ---- -Upstream-Status: Submitted [https://github.com/ikeydoherty/cve-check-tool/pull/48] - configure.ac | 2 ++ - src/core.c | 4 ++-- - 2 files changed, 4 insertions(+), 2 deletions(-) - -diff --git a/configure.ac b/configure.ac -index d3b66ce..79c3542 100644 ---- a/configure.ac -+++ b/configure.ac -@@ -19,6 +19,8 @@ m4_define([json_required_version], [0.16.0]) - m4_define([openssl_required_version],[1.0.0]) - # TODO: Set minimum sqlite - -+AC_CHECK_FUNCS_ONCE(malloc_trim) -+ - PKG_CHECK_MODULES(CVE_CHECK_TOOL, - [ - glib-2.0 >= glib_required_version, -diff --git a/src/core.c b/src/core.c -index 6263031..0d5df29 100644 ---- a/src/core.c -+++ b/src/core.c -@@ -498,9 +498,9 @@ bool cve_db_load(CveDB *self, const char *fname) - } - - b = true; -- -+#ifdef HAVE_MALLOC_TRIM - malloc_trim(0); -- -+#endif - xmlFreeTextReader(r); - if (fd) { - close(fd); --- -2.9.3 -
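The heart of the new matching logic in check_cves() above is a version-range test: each PRODUCTS row carries optional start/end bounds with '>', '>=', '<' or '<=' operators (or an exact '=' match), and the recipe's CVE_VERSION is compared against them with distutils' LooseVersion. Below is a simplified, stand-alone sketch of that test for illustration only; the example versions and range are made up.

# Simplified illustration of the range check performed by check_cves() in the
# patched cve-check.bbclass: decide whether a package version falls inside a
# CVE's affected range. Uses distutils LooseVersion, as the patch does.
from distutils.version import LooseVersion

def version_affected(pv, version_start, operator_start, version_end, operator_end):
    # Exact match, as emitted for CPE entries with a concrete version.
    if operator_start == "=":
        return pv == version_start

    hit_start = False
    hit_end = False
    if operator_start:
        hit_start = (operator_start == ">=" and LooseVersion(pv) >= LooseVersion(version_start)) \
                 or (operator_start == ">" and LooseVersion(pv) > LooseVersion(version_start))
    if operator_end:
        hit_end = (operator_end == "<=" and LooseVersion(pv) <= LooseVersion(version_end)) \
               or (operator_end == "<" and LooseVersion(pv) < LooseVersion(version_end))

    # With both bounds present the version must satisfy both of them;
    # with a single bound that bound alone decides.
    if operator_start and operator_end:
        return hit_start and hit_end
    return hit_start or hit_end

# Made-up example: a range ">= 7.20.0 and < 7.62.0"
print(version_affected("7.61.0", "7.20.0", ">=", "7.62.0", "<"))  # True
print(version_affected("7.63.0", "7.20.0", ">=", "7.62.0", "<"))  # False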
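The PRODUCTS table is filled from NVD CPE 2.3 URIs by cpe_generator() in cve-update-db-native.bb above: vendor, product and version are the 4th, 5th and 6th colon-separated fields of cpe23Uri, and when the version field is '*' the range bounds come from the versionStart*/versionEnd* keys instead. A small sketch of that extraction follows, using a hypothetical cpe_match entry shaped like the NVD JSON feed; the CVE id is a placeholder.

# Sketch of the CPE 2.3 URI handling done by cpe_generator() in
# cve-update-db-native.bb above. The example cpe_match entry is hypothetical,
# shaped like the data in the NVD JSON feed the recipe parses.
def cpe_to_product_row(cve_id, cpe):
    cpe23 = cpe["cpe23Uri"].split(":")
    vendor, product, version = cpe23[3], cpe23[4], cpe23[5]

    if version != "*":
        # Concrete version: store an exact '=' match.
        return [cve_id, vendor, product, version, "=", "", ""]

    # Wildcard version: build a range from the optional bound keys.
    v_start, op_start, v_end, op_end = "", "", "", ""
    if "versionStartIncluding" in cpe:
        v_start, op_start = cpe["versionStartIncluding"], ">="
    if "versionStartExcluding" in cpe:
        v_start, op_start = cpe["versionStartExcluding"], ">"
    if "versionEndIncluding" in cpe:
        v_end, op_end = cpe["versionEndIncluding"], "<="
    if "versionEndExcluding" in cpe:
        v_end, op_end = cpe["versionEndExcluding"], "<"
    return [cve_id, vendor, product, v_start, op_start, v_end, op_end]

# Hypothetical entry (placeholder CVE id), matching the feed's shape:
example = {
    "vulnerable": True,
    "cpe23Uri": "cpe:2.3:a:haxx:curl:*:*:*:*:*:*:*:*",
    "versionEndExcluding": "7.62.0",
}
print(cpe_to_product_row("CVE-2019-0000", example))
# -> ['CVE-2019-0000', 'haxx', 'curl', '', '', '7.62.0', '<']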