mirror of https://github.com/speed47/spectre-meltdown-checker.git
synced 2026-05-13 19:03:20 +02:00

Compare commits: 6 commits, v0.14..vuln-watch
| Author | SHA1 | Date |
|---|---|---|
| | 7f5256f15e | |
| | 7a3224ad61 | |
| | 31cf549c75 | |
| | b305cc48c3 | |
| | 12f545dc45 | |
| | 94356c4992 | |
@@ -0,0 +1,4 @@
__pycache__/
*.py[cod]
*.egg-info/
.venv/
@@ -1,674 +0,0 @@
|
|||||||
GNU GENERAL PUBLIC LICENSE
|
|
||||||
Version 3, 29 June 2007
|
|
||||||
|
|
||||||
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
|
|
||||||
Everyone is permitted to copy and distribute verbatim copies
|
|
||||||
of this license document, but changing it is not allowed.
|
|
||||||
|
|
||||||
Preamble
|
|
||||||
|
|
||||||
The GNU General Public License is a free, copyleft license for
|
|
||||||
software and other kinds of works.
|
|
||||||
|
|
||||||
The licenses for most software and other practical works are designed
|
|
||||||
to take away your freedom to share and change the works. By contrast,
|
|
||||||
the GNU General Public License is intended to guarantee your freedom to
|
|
||||||
share and change all versions of a program--to make sure it remains free
|
|
||||||
software for all its users. We, the Free Software Foundation, use the
|
|
||||||
GNU General Public License for most of our software; it applies also to
|
|
||||||
any other work released this way by its authors. You can apply it to
|
|
||||||
your programs, too.
|
|
||||||
|
|
||||||
When we speak of free software, we are referring to freedom, not
|
|
||||||
price. Our General Public Licenses are designed to make sure that you
|
|
||||||
have the freedom to distribute copies of free software (and charge for
|
|
||||||
them if you wish), that you receive source code or can get it if you
|
|
||||||
want it, that you can change the software or use pieces of it in new
|
|
||||||
free programs, and that you know you can do these things.
|
|
||||||
|
|
||||||
To protect your rights, we need to prevent others from denying you
|
|
||||||
these rights or asking you to surrender the rights. Therefore, you have
|
|
||||||
certain responsibilities if you distribute copies of the software, or if
|
|
||||||
you modify it: responsibilities to respect the freedom of others.
|
|
||||||
|
|
||||||
For example, if you distribute copies of such a program, whether
|
|
||||||
gratis or for a fee, you must pass on to the recipients the same
|
|
||||||
freedoms that you received. You must make sure that they, too, receive
|
|
||||||
or can get the source code. And you must show them these terms so they
|
|
||||||
know their rights.
|
|
||||||
|
|
||||||
Developers that use the GNU GPL protect your rights with two steps:
|
|
||||||
(1) assert copyright on the software, and (2) offer you this License
|
|
||||||
giving you legal permission to copy, distribute and/or modify it.
|
|
||||||
|
|
||||||
For the developers' and authors' protection, the GPL clearly explains
|
|
||||||
that there is no warranty for this free software. For both users' and
|
|
||||||
authors' sake, the GPL requires that modified versions be marked as
|
|
||||||
changed, so that their problems will not be attributed erroneously to
|
|
||||||
authors of previous versions.
|
|
||||||
|
|
||||||
Some devices are designed to deny users access to install or run
|
|
||||||
modified versions of the software inside them, although the manufacturer
|
|
||||||
can do so. This is fundamentally incompatible with the aim of
|
|
||||||
protecting users' freedom to change the software. The systematic
|
|
||||||
pattern of such abuse occurs in the area of products for individuals to
|
|
||||||
use, which is precisely where it is most unacceptable. Therefore, we
|
|
||||||
have designed this version of the GPL to prohibit the practice for those
|
|
||||||
products. If such problems arise substantially in other domains, we
|
|
||||||
stand ready to extend this provision to those domains in future versions
|
|
||||||
of the GPL, as needed to protect the freedom of users.
|
|
||||||
|
|
||||||
Finally, every program is threatened constantly by software patents.
|
|
||||||
States should not allow patents to restrict development and use of
|
|
||||||
software on general-purpose computers, but in those that do, we wish to
|
|
||||||
avoid the special danger that patents applied to a free program could
|
|
||||||
make it effectively proprietary. To prevent this, the GPL assures that
|
|
||||||
patents cannot be used to render the program non-free.
|
|
||||||
|
|
||||||
The precise terms and conditions for copying, distribution and
|
|
||||||
modification follow.
|
|
||||||
|
|
||||||
TERMS AND CONDITIONS
|
|
||||||
|
|
||||||
0. Definitions.
|
|
||||||
|
|
||||||
"This License" refers to version 3 of the GNU General Public License.
|
|
||||||
|
|
||||||
"Copyright" also means copyright-like laws that apply to other kinds of
|
|
||||||
works, such as semiconductor masks.
|
|
||||||
|
|
||||||
"The Program" refers to any copyrightable work licensed under this
|
|
||||||
License. Each licensee is addressed as "you". "Licensees" and
|
|
||||||
"recipients" may be individuals or organizations.
|
|
||||||
|
|
||||||
To "modify" a work means to copy from or adapt all or part of the work
|
|
||||||
in a fashion requiring copyright permission, other than the making of an
|
|
||||||
exact copy. The resulting work is called a "modified version" of the
|
|
||||||
earlier work or a work "based on" the earlier work.
|
|
||||||
|
|
||||||
A "covered work" means either the unmodified Program or a work based
|
|
||||||
on the Program.
|
|
||||||
|
|
||||||
To "propagate" a work means to do anything with it that, without
|
|
||||||
permission, would make you directly or secondarily liable for
|
|
||||||
infringement under applicable copyright law, except executing it on a
|
|
||||||
computer or modifying a private copy. Propagation includes copying,
|
|
||||||
distribution (with or without modification), making available to the
|
|
||||||
public, and in some countries other activities as well.
|
|
||||||
|
|
||||||
To "convey" a work means any kind of propagation that enables other
|
|
||||||
parties to make or receive copies. Mere interaction with a user through
|
|
||||||
a computer network, with no transfer of a copy, is not conveying.
|
|
||||||
|
|
||||||
An interactive user interface displays "Appropriate Legal Notices"
|
|
||||||
to the extent that it includes a convenient and prominently visible
|
|
||||||
feature that (1) displays an appropriate copyright notice, and (2)
|
|
||||||
tells the user that there is no warranty for the work (except to the
|
|
||||||
extent that warranties are provided), that licensees may convey the
|
|
||||||
work under this License, and how to view a copy of this License. If
|
|
||||||
the interface presents a list of user commands or options, such as a
|
|
||||||
menu, a prominent item in the list meets this criterion.
|
|
||||||
|
|
||||||
1. Source Code.
|
|
||||||
|
|
||||||
The "source code" for a work means the preferred form of the work
|
|
||||||
for making modifications to it. "Object code" means any non-source
|
|
||||||
form of a work.
|
|
||||||
|
|
||||||
A "Standard Interface" means an interface that either is an official
|
|
||||||
standard defined by a recognized standards body, or, in the case of
|
|
||||||
interfaces specified for a particular programming language, one that
|
|
||||||
is widely used among developers working in that language.
|
|
||||||
|
|
||||||
The "System Libraries" of an executable work include anything, other
|
|
||||||
than the work as a whole, that (a) is included in the normal form of
|
|
||||||
packaging a Major Component, but which is not part of that Major
|
|
||||||
Component, and (b) serves only to enable use of the work with that
|
|
||||||
Major Component, or to implement a Standard Interface for which an
|
|
||||||
implementation is available to the public in source code form. A
|
|
||||||
"Major Component", in this context, means a major essential component
|
|
||||||
(kernel, window system, and so on) of the specific operating system
|
|
||||||
(if any) on which the executable work runs, or a compiler used to
|
|
||||||
produce the work, or an object code interpreter used to run it.
|
|
||||||
|
|
||||||
The "Corresponding Source" for a work in object code form means all
|
|
||||||
the source code needed to generate, install, and (for an executable
|
|
||||||
work) run the object code and to modify the work, including scripts to
|
|
||||||
control those activities. However, it does not include the work's
|
|
||||||
System Libraries, or general-purpose tools or generally available free
|
|
||||||
programs which are used unmodified in performing those activities but
|
|
||||||
which are not part of the work. For example, Corresponding Source
|
|
||||||
includes interface definition files associated with source files for
|
|
||||||
the work, and the source code for shared libraries and dynamically
|
|
||||||
linked subprograms that the work is specifically designed to require,
|
|
||||||
such as by intimate data communication or control flow between those
|
|
||||||
subprograms and other parts of the work.
|
|
||||||
|
|
||||||
The Corresponding Source need not include anything that users
|
|
||||||
can regenerate automatically from other parts of the Corresponding
|
|
||||||
Source.
|
|
||||||
|
|
||||||
The Corresponding Source for a work in source code form is that
|
|
||||||
same work.
|
|
||||||
|
|
||||||
2. Basic Permissions.
|
|
||||||
|
|
||||||
All rights granted under this License are granted for the term of
|
|
||||||
copyright on the Program, and are irrevocable provided the stated
|
|
||||||
conditions are met. This License explicitly affirms your unlimited
|
|
||||||
permission to run the unmodified Program. The output from running a
|
|
||||||
covered work is covered by this License only if the output, given its
|
|
||||||
content, constitutes a covered work. This License acknowledges your
|
|
||||||
rights of fair use or other equivalent, as provided by copyright law.
|
|
||||||
|
|
||||||
You may make, run and propagate covered works that you do not
|
|
||||||
convey, without conditions so long as your license otherwise remains
|
|
||||||
in force. You may convey covered works to others for the sole purpose
|
|
||||||
of having them make modifications exclusively for you, or provide you
|
|
||||||
with facilities for running those works, provided that you comply with
|
|
||||||
the terms of this License in conveying all material for which you do
|
|
||||||
not control copyright. Those thus making or running the covered works
|
|
||||||
for you must do so exclusively on your behalf, under your direction
|
|
||||||
and control, on terms that prohibit them from making any copies of
|
|
||||||
your copyrighted material outside their relationship with you.
|
|
||||||
|
|
||||||
Conveying under any other circumstances is permitted solely under
|
|
||||||
the conditions stated below. Sublicensing is not allowed; section 10
|
|
||||||
makes it unnecessary.
|
|
||||||
|
|
||||||
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
|
|
||||||
|
|
||||||
No covered work shall be deemed part of an effective technological
|
|
||||||
measure under any applicable law fulfilling obligations under article
|
|
||||||
11 of the WIPO copyright treaty adopted on 20 December 1996, or
|
|
||||||
similar laws prohibiting or restricting circumvention of such
|
|
||||||
measures.
|
|
||||||
|
|
||||||
When you convey a covered work, you waive any legal power to forbid
|
|
||||||
circumvention of technological measures to the extent such circumvention
|
|
||||||
is effected by exercising rights under this License with respect to
|
|
||||||
the covered work, and you disclaim any intention to limit operation or
|
|
||||||
modification of the work as a means of enforcing, against the work's
|
|
||||||
users, your or third parties' legal rights to forbid circumvention of
|
|
||||||
technological measures.
|
|
||||||
|
|
||||||
4. Conveying Verbatim Copies.
|
|
||||||
|
|
||||||
You may convey verbatim copies of the Program's source code as you
|
|
||||||
receive it, in any medium, provided that you conspicuously and
|
|
||||||
appropriately publish on each copy an appropriate copyright notice;
|
|
||||||
keep intact all notices stating that this License and any
|
|
||||||
non-permissive terms added in accord with section 7 apply to the code;
|
|
||||||
keep intact all notices of the absence of any warranty; and give all
|
|
||||||
recipients a copy of this License along with the Program.
|
|
||||||
|
|
||||||
You may charge any price or no price for each copy that you convey,
|
|
||||||
and you may offer support or warranty protection for a fee.
|
|
||||||
|
|
||||||
5. Conveying Modified Source Versions.
|
|
||||||
|
|
||||||
You may convey a work based on the Program, or the modifications to
|
|
||||||
produce it from the Program, in the form of source code under the
|
|
||||||
terms of section 4, provided that you also meet all of these conditions:
|
|
||||||
|
|
||||||
a) The work must carry prominent notices stating that you modified
|
|
||||||
it, and giving a relevant date.
|
|
||||||
|
|
||||||
b) The work must carry prominent notices stating that it is
|
|
||||||
released under this License and any conditions added under section
|
|
||||||
7. This requirement modifies the requirement in section 4 to
|
|
||||||
"keep intact all notices".
|
|
||||||
|
|
||||||
c) You must license the entire work, as a whole, under this
|
|
||||||
License to anyone who comes into possession of a copy. This
|
|
||||||
License will therefore apply, along with any applicable section 7
|
|
||||||
additional terms, to the whole of the work, and all its parts,
|
|
||||||
regardless of how they are packaged. This License gives no
|
|
||||||
permission to license the work in any other way, but it does not
|
|
||||||
invalidate such permission if you have separately received it.
|
|
||||||
|
|
||||||
d) If the work has interactive user interfaces, each must display
|
|
||||||
Appropriate Legal Notices; however, if the Program has interactive
|
|
||||||
interfaces that do not display Appropriate Legal Notices, your
|
|
||||||
work need not make them do so.
|
|
||||||
|
|
||||||
A compilation of a covered work with other separate and independent
|
|
||||||
works, which are not by their nature extensions of the covered work,
|
|
||||||
and which are not combined with it such as to form a larger program,
|
|
||||||
in or on a volume of a storage or distribution medium, is called an
|
|
||||||
"aggregate" if the compilation and its resulting copyright are not
|
|
||||||
used to limit the access or legal rights of the compilation's users
|
|
||||||
beyond what the individual works permit. Inclusion of a covered work
|
|
||||||
in an aggregate does not cause this License to apply to the other
|
|
||||||
parts of the aggregate.
|
|
||||||
|
|
||||||
6. Conveying Non-Source Forms.
|
|
||||||
|
|
||||||
You may convey a covered work in object code form under the terms
|
|
||||||
of sections 4 and 5, provided that you also convey the
|
|
||||||
machine-readable Corresponding Source under the terms of this License,
|
|
||||||
in one of these ways:
|
|
||||||
|
|
||||||
a) Convey the object code in, or embodied in, a physical product
|
|
||||||
(including a physical distribution medium), accompanied by the
|
|
||||||
Corresponding Source fixed on a durable physical medium
|
|
||||||
customarily used for software interchange.
|
|
||||||
|
|
||||||
b) Convey the object code in, or embodied in, a physical product
|
|
||||||
(including a physical distribution medium), accompanied by a
|
|
||||||
written offer, valid for at least three years and valid for as
|
|
||||||
long as you offer spare parts or customer support for that product
|
|
||||||
model, to give anyone who possesses the object code either (1) a
|
|
||||||
copy of the Corresponding Source for all the software in the
|
|
||||||
product that is covered by this License, on a durable physical
|
|
||||||
medium customarily used for software interchange, for a price no
|
|
||||||
more than your reasonable cost of physically performing this
|
|
||||||
conveying of source, or (2) access to copy the
|
|
||||||
Corresponding Source from a network server at no charge.
|
|
||||||
|
|
||||||
c) Convey individual copies of the object code with a copy of the
|
|
||||||
written offer to provide the Corresponding Source. This
|
|
||||||
alternative is allowed only occasionally and noncommercially, and
|
|
||||||
only if you received the object code with such an offer, in accord
|
|
||||||
with subsection 6b.
|
|
||||||
|
|
||||||
d) Convey the object code by offering access from a designated
|
|
||||||
place (gratis or for a charge), and offer equivalent access to the
|
|
||||||
Corresponding Source in the same way through the same place at no
|
|
||||||
further charge. You need not require recipients to copy the
|
|
||||||
Corresponding Source along with the object code. If the place to
|
|
||||||
copy the object code is a network server, the Corresponding Source
|
|
||||||
may be on a different server (operated by you or a third party)
|
|
||||||
that supports equivalent copying facilities, provided you maintain
|
|
||||||
clear directions next to the object code saying where to find the
|
|
||||||
Corresponding Source. Regardless of what server hosts the
|
|
||||||
Corresponding Source, you remain obligated to ensure that it is
|
|
||||||
available for as long as needed to satisfy these requirements.
|
|
||||||
|
|
||||||
e) Convey the object code using peer-to-peer transmission, provided
|
|
||||||
you inform other peers where the object code and Corresponding
|
|
||||||
Source of the work are being offered to the general public at no
|
|
||||||
charge under subsection 6d.
|
|
||||||
|
|
||||||
A separable portion of the object code, whose source code is excluded
|
|
||||||
from the Corresponding Source as a System Library, need not be
|
|
||||||
included in conveying the object code work.
|
|
||||||
|
|
||||||
A "User Product" is either (1) a "consumer product", which means any
|
|
||||||
tangible personal property which is normally used for personal, family,
|
|
||||||
or household purposes, or (2) anything designed or sold for incorporation
|
|
||||||
into a dwelling. In determining whether a product is a consumer product,
|
|
||||||
doubtful cases shall be resolved in favor of coverage. For a particular
|
|
||||||
product received by a particular user, "normally used" refers to a
|
|
||||||
typical or common use of that class of product, regardless of the status
|
|
||||||
of the particular user or of the way in which the particular user
|
|
||||||
actually uses, or expects or is expected to use, the product. A product
|
|
||||||
is a consumer product regardless of whether the product has substantial
|
|
||||||
commercial, industrial or non-consumer uses, unless such uses represent
|
|
||||||
the only significant mode of use of the product.
|
|
||||||
|
|
||||||
"Installation Information" for a User Product means any methods,
|
|
||||||
procedures, authorization keys, or other information required to install
|
|
||||||
and execute modified versions of a covered work in that User Product from
|
|
||||||
a modified version of its Corresponding Source. The information must
|
|
||||||
suffice to ensure that the continued functioning of the modified object
|
|
||||||
code is in no case prevented or interfered with solely because
|
|
||||||
modification has been made.
|
|
||||||
|
|
||||||
If you convey an object code work under this section in, or with, or
|
|
||||||
specifically for use in, a User Product, and the conveying occurs as
|
|
||||||
part of a transaction in which the right of possession and use of the
|
|
||||||
User Product is transferred to the recipient in perpetuity or for a
|
|
||||||
fixed term (regardless of how the transaction is characterized), the
|
|
||||||
Corresponding Source conveyed under this section must be accompanied
|
|
||||||
by the Installation Information. But this requirement does not apply
|
|
||||||
if neither you nor any third party retains the ability to install
|
|
||||||
modified object code on the User Product (for example, the work has
|
|
||||||
been installed in ROM).
|
|
||||||
|
|
||||||
The requirement to provide Installation Information does not include a
|
|
||||||
requirement to continue to provide support service, warranty, or updates
|
|
||||||
for a work that has been modified or installed by the recipient, or for
|
|
||||||
the User Product in which it has been modified or installed. Access to a
|
|
||||||
network may be denied when the modification itself materially and
|
|
||||||
adversely affects the operation of the network or violates the rules and
|
|
||||||
protocols for communication across the network.
|
|
||||||
|
|
||||||
Corresponding Source conveyed, and Installation Information provided,
|
|
||||||
in accord with this section must be in a format that is publicly
|
|
||||||
documented (and with an implementation available to the public in
|
|
||||||
source code form), and must require no special password or key for
|
|
||||||
unpacking, reading or copying.
|
|
||||||
|
|
||||||
7. Additional Terms.
|
|
||||||
|
|
||||||
"Additional permissions" are terms that supplement the terms of this
|
|
||||||
License by making exceptions from one or more of its conditions.
|
|
||||||
Additional permissions that are applicable to the entire Program shall
|
|
||||||
be treated as though they were included in this License, to the extent
|
|
||||||
that they are valid under applicable law. If additional permissions
|
|
||||||
apply only to part of the Program, that part may be used separately
|
|
||||||
under those permissions, but the entire Program remains governed by
|
|
||||||
this License without regard to the additional permissions.
|
|
||||||
|
|
||||||
When you convey a copy of a covered work, you may at your option
|
|
||||||
remove any additional permissions from that copy, or from any part of
|
|
||||||
it. (Additional permissions may be written to require their own
|
|
||||||
removal in certain cases when you modify the work.) You may place
|
|
||||||
additional permissions on material, added by you to a covered work,
|
|
||||||
for which you have or can give appropriate copyright permission.
|
|
||||||
|
|
||||||
Notwithstanding any other provision of this License, for material you
|
|
||||||
add to a covered work, you may (if authorized by the copyright holders of
|
|
||||||
that material) supplement the terms of this License with terms:
|
|
||||||
|
|
||||||
a) Disclaiming warranty or limiting liability differently from the
|
|
||||||
terms of sections 15 and 16 of this License; or
|
|
||||||
|
|
||||||
b) Requiring preservation of specified reasonable legal notices or
|
|
||||||
author attributions in that material or in the Appropriate Legal
|
|
||||||
Notices displayed by works containing it; or
|
|
||||||
|
|
||||||
c) Prohibiting misrepresentation of the origin of that material, or
|
|
||||||
requiring that modified versions of such material be marked in
|
|
||||||
reasonable ways as different from the original version; or
|
|
||||||
|
|
||||||
d) Limiting the use for publicity purposes of names of licensors or
|
|
||||||
authors of the material; or
|
|
||||||
|
|
||||||
e) Declining to grant rights under trademark law for use of some
|
|
||||||
trade names, trademarks, or service marks; or
|
|
||||||
|
|
||||||
f) Requiring indemnification of licensors and authors of that
|
|
||||||
material by anyone who conveys the material (or modified versions of
|
|
||||||
it) with contractual assumptions of liability to the recipient, for
|
|
||||||
any liability that these contractual assumptions directly impose on
|
|
||||||
those licensors and authors.
|
|
||||||
|
|
||||||
All other non-permissive additional terms are considered "further
|
|
||||||
restrictions" within the meaning of section 10. If the Program as you
|
|
||||||
received it, or any part of it, contains a notice stating that it is
|
|
||||||
governed by this License along with a term that is a further
|
|
||||||
restriction, you may remove that term. If a license document contains
|
|
||||||
a further restriction but permits relicensing or conveying under this
|
|
||||||
License, you may add to a covered work material governed by the terms
|
|
||||||
of that license document, provided that the further restriction does
|
|
||||||
not survive such relicensing or conveying.
|
|
||||||
|
|
||||||
If you add terms to a covered work in accord with this section, you
|
|
||||||
must place, in the relevant source files, a statement of the
|
|
||||||
additional terms that apply to those files, or a notice indicating
|
|
||||||
where to find the applicable terms.
|
|
||||||
|
|
||||||
Additional terms, permissive or non-permissive, may be stated in the
|
|
||||||
form of a separately written license, or stated as exceptions;
|
|
||||||
the above requirements apply either way.
|
|
||||||
|
|
||||||
8. Termination.
|
|
||||||
|
|
||||||
You may not propagate or modify a covered work except as expressly
|
|
||||||
provided under this License. Any attempt otherwise to propagate or
|
|
||||||
modify it is void, and will automatically terminate your rights under
|
|
||||||
this License (including any patent licenses granted under the third
|
|
||||||
paragraph of section 11).
|
|
||||||
|
|
||||||
However, if you cease all violation of this License, then your
|
|
||||||
license from a particular copyright holder is reinstated (a)
|
|
||||||
provisionally, unless and until the copyright holder explicitly and
|
|
||||||
finally terminates your license, and (b) permanently, if the copyright
|
|
||||||
holder fails to notify you of the violation by some reasonable means
|
|
||||||
prior to 60 days after the cessation.
|
|
||||||
|
|
||||||
Moreover, your license from a particular copyright holder is
|
|
||||||
reinstated permanently if the copyright holder notifies you of the
|
|
||||||
violation by some reasonable means, this is the first time you have
|
|
||||||
received notice of violation of this License (for any work) from that
|
|
||||||
copyright holder, and you cure the violation prior to 30 days after
|
|
||||||
your receipt of the notice.
|
|
||||||
|
|
||||||
Termination of your rights under this section does not terminate the
|
|
||||||
licenses of parties who have received copies or rights from you under
|
|
||||||
this License. If your rights have been terminated and not permanently
|
|
||||||
reinstated, you do not qualify to receive new licenses for the same
|
|
||||||
material under section 10.
|
|
||||||
|
|
||||||
9. Acceptance Not Required for Having Copies.
|
|
||||||
|
|
||||||
You are not required to accept this License in order to receive or
|
|
||||||
run a copy of the Program. Ancillary propagation of a covered work
|
|
||||||
occurring solely as a consequence of using peer-to-peer transmission
|
|
||||||
to receive a copy likewise does not require acceptance. However,
|
|
||||||
nothing other than this License grants you permission to propagate or
|
|
||||||
modify any covered work. These actions infringe copyright if you do
|
|
||||||
not accept this License. Therefore, by modifying or propagating a
|
|
||||||
covered work, you indicate your acceptance of this License to do so.
|
|
||||||
|
|
||||||
10. Automatic Licensing of Downstream Recipients.
|
|
||||||
|
|
||||||
Each time you convey a covered work, the recipient automatically
|
|
||||||
receives a license from the original licensors, to run, modify and
|
|
||||||
propagate that work, subject to this License. You are not responsible
|
|
||||||
for enforcing compliance by third parties with this License.
|
|
||||||
|
|
||||||
An "entity transaction" is a transaction transferring control of an
|
|
||||||
organization, or substantially all assets of one, or subdividing an
|
|
||||||
organization, or merging organizations. If propagation of a covered
|
|
||||||
work results from an entity transaction, each party to that
|
|
||||||
transaction who receives a copy of the work also receives whatever
|
|
||||||
licenses to the work the party's predecessor in interest had or could
|
|
||||||
give under the previous paragraph, plus a right to possession of the
|
|
||||||
Corresponding Source of the work from the predecessor in interest, if
|
|
||||||
the predecessor has it or can get it with reasonable efforts.
|
|
||||||
|
|
||||||
You may not impose any further restrictions on the exercise of the
|
|
||||||
rights granted or affirmed under this License. For example, you may
|
|
||||||
not impose a license fee, royalty, or other charge for exercise of
|
|
||||||
rights granted under this License, and you may not initiate litigation
|
|
||||||
(including a cross-claim or counterclaim in a lawsuit) alleging that
|
|
||||||
any patent claim is infringed by making, using, selling, offering for
|
|
||||||
sale, or importing the Program or any portion of it.
|
|
||||||
|
|
||||||
11. Patents.
|
|
||||||
|
|
||||||
A "contributor" is a copyright holder who authorizes use under this
|
|
||||||
License of the Program or a work on which the Program is based. The
|
|
||||||
work thus licensed is called the contributor's "contributor version".
|
|
||||||
|
|
||||||
A contributor's "essential patent claims" are all patent claims
|
|
||||||
owned or controlled by the contributor, whether already acquired or
|
|
||||||
hereafter acquired, that would be infringed by some manner, permitted
|
|
||||||
by this License, of making, using, or selling its contributor version,
|
|
||||||
but do not include claims that would be infringed only as a
|
|
||||||
consequence of further modification of the contributor version. For
|
|
||||||
purposes of this definition, "control" includes the right to grant
|
|
||||||
patent sublicenses in a manner consistent with the requirements of
|
|
||||||
this License.
|
|
||||||
|
|
||||||
Each contributor grants you a non-exclusive, worldwide, royalty-free
|
|
||||||
patent license under the contributor's essential patent claims, to
|
|
||||||
make, use, sell, offer for sale, import and otherwise run, modify and
|
|
||||||
propagate the contents of its contributor version.
|
|
||||||
|
|
||||||
In the following three paragraphs, a "patent license" is any express
|
|
||||||
agreement or commitment, however denominated, not to enforce a patent
|
|
||||||
(such as an express permission to practice a patent or covenant not to
|
|
||||||
sue for patent infringement). To "grant" such a patent license to a
|
|
||||||
party means to make such an agreement or commitment not to enforce a
|
|
||||||
patent against the party.
|
|
||||||
|
|
||||||
If you convey a covered work, knowingly relying on a patent license,
|
|
||||||
and the Corresponding Source of the work is not available for anyone
|
|
||||||
to copy, free of charge and under the terms of this License, through a
|
|
||||||
publicly available network server or other readily accessible means,
|
|
||||||
then you must either (1) cause the Corresponding Source to be so
|
|
||||||
available, or (2) arrange to deprive yourself of the benefit of the
|
|
||||||
patent license for this particular work, or (3) arrange, in a manner
|
|
||||||
consistent with the requirements of this License, to extend the patent
|
|
||||||
license to downstream recipients. "Knowingly relying" means you have
|
|
||||||
actual knowledge that, but for the patent license, your conveying the
|
|
||||||
covered work in a country, or your recipient's use of the covered work
|
|
||||||
in a country, would infringe one or more identifiable patents in that
|
|
||||||
country that you have reason to believe are valid.
|
|
||||||
|
|
||||||
If, pursuant to or in connection with a single transaction or
|
|
||||||
arrangement, you convey, or propagate by procuring conveyance of, a
|
|
||||||
covered work, and grant a patent license to some of the parties
|
|
||||||
receiving the covered work authorizing them to use, propagate, modify
|
|
||||||
or convey a specific copy of the covered work, then the patent license
|
|
||||||
you grant is automatically extended to all recipients of the covered
|
|
||||||
work and works based on it.
|
|
||||||
|
|
||||||
A patent license is "discriminatory" if it does not include within
|
|
||||||
the scope of its coverage, prohibits the exercise of, or is
|
|
||||||
conditioned on the non-exercise of one or more of the rights that are
|
|
||||||
specifically granted under this License. You may not convey a covered
|
|
||||||
work if you are a party to an arrangement with a third party that is
|
|
||||||
in the business of distributing software, under which you make payment
|
|
||||||
to the third party based on the extent of your activity of conveying
|
|
||||||
the work, and under which the third party grants, to any of the
|
|
||||||
parties who would receive the covered work from you, a discriminatory
|
|
||||||
patent license (a) in connection with copies of the covered work
|
|
||||||
conveyed by you (or copies made from those copies), or (b) primarily
|
|
||||||
for and in connection with specific products or compilations that
|
|
||||||
contain the covered work, unless you entered into that arrangement,
|
|
||||||
or that patent license was granted, prior to 28 March 2007.
|
|
||||||
|
|
||||||
Nothing in this License shall be construed as excluding or limiting
|
|
||||||
any implied license or other defenses to infringement that may
|
|
||||||
otherwise be available to you under applicable patent law.
|
|
||||||
|
|
||||||
12. No Surrender of Others' Freedom.
|
|
||||||
|
|
||||||
If conditions are imposed on you (whether by court order, agreement or
|
|
||||||
otherwise) that contradict the conditions of this License, they do not
|
|
||||||
excuse you from the conditions of this License. If you cannot convey a
|
|
||||||
covered work so as to satisfy simultaneously your obligations under this
|
|
||||||
License and any other pertinent obligations, then as a consequence you may
|
|
||||||
not convey it at all. For example, if you agree to terms that obligate you
|
|
||||||
to collect a royalty for further conveying from those to whom you convey
|
|
||||||
the Program, the only way you could satisfy both those terms and this
|
|
||||||
License would be to refrain entirely from conveying the Program.
|
|
||||||
|
|
||||||
13. Use with the GNU Affero General Public License.
|
|
||||||
|
|
||||||
Notwithstanding any other provision of this License, you have
|
|
||||||
permission to link or combine any covered work with a work licensed
|
|
||||||
under version 3 of the GNU Affero General Public License into a single
|
|
||||||
combined work, and to convey the resulting work. The terms of this
|
|
||||||
License will continue to apply to the part which is the covered work,
|
|
||||||
but the special requirements of the GNU Affero General Public License,
|
|
||||||
section 13, concerning interaction through a network will apply to the
|
|
||||||
combination as such.
|
|
||||||
|
|
||||||
14. Revised Versions of this License.
|
|
||||||
|
|
||||||
The Free Software Foundation may publish revised and/or new versions of
|
|
||||||
the GNU General Public License from time to time. Such new versions will
|
|
||||||
be similar in spirit to the present version, but may differ in detail to
|
|
||||||
address new problems or concerns.
|
|
||||||
|
|
||||||
Each version is given a distinguishing version number. If the
|
|
||||||
Program specifies that a certain numbered version of the GNU General
|
|
||||||
Public License "or any later version" applies to it, you have the
|
|
||||||
option of following the terms and conditions either of that numbered
|
|
||||||
version or of any later version published by the Free Software
|
|
||||||
Foundation. If the Program does not specify a version number of the
|
|
||||||
GNU General Public License, you may choose any version ever published
|
|
||||||
by the Free Software Foundation.
|
|
||||||
|
|
||||||
If the Program specifies that a proxy can decide which future
|
|
||||||
versions of the GNU General Public License can be used, that proxy's
|
|
||||||
public statement of acceptance of a version permanently authorizes you
|
|
||||||
to choose that version for the Program.
|
|
||||||
|
|
||||||
Later license versions may give you additional or different
|
|
||||||
permissions. However, no additional obligations are imposed on any
|
|
||||||
author or copyright holder as a result of your choosing to follow a
|
|
||||||
later version.
|
|
||||||
|
|
||||||
15. Disclaimer of Warranty.
|
|
||||||
|
|
||||||
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
|
|
||||||
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
|
|
||||||
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
|
|
||||||
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
|
|
||||||
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
|
|
||||||
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
|
|
||||||
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
|
|
||||||
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
|
|
||||||
|
|
||||||
16. Limitation of Liability.
|
|
||||||
|
|
||||||
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
|
|
||||||
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
|
|
||||||
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
|
|
||||||
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
|
|
||||||
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
|
|
||||||
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
|
|
||||||
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
|
|
||||||
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
|
|
||||||
SUCH DAMAGES.
|
|
||||||
|
|
||||||
17. Interpretation of Sections 15 and 16.
|
|
||||||
|
|
||||||
If the disclaimer of warranty and limitation of liability provided
|
|
||||||
above cannot be given local legal effect according to their terms,
|
|
||||||
reviewing courts shall apply local law that most closely approximates
|
|
||||||
an absolute waiver of all civil liability in connection with the
|
|
||||||
Program, unless a warranty or assumption of liability accompanies a
|
|
||||||
copy of the Program in return for a fee.
|
|
||||||
|
|
||||||
END OF TERMS AND CONDITIONS
|
|
||||||
|
|
||||||
How to Apply These Terms to Your New Programs
|
|
||||||
|
|
||||||
If you develop a new program, and you want it to be of the greatest
|
|
||||||
possible use to the public, the best way to achieve this is to make it
|
|
||||||
free software which everyone can redistribute and change under these terms.
|
|
||||||
|
|
||||||
To do so, attach the following notices to the program. It is safest
|
|
||||||
to attach them to the start of each source file to most effectively
|
|
||||||
state the exclusion of warranty; and each file should have at least
|
|
||||||
the "copyright" line and a pointer to where the full notice is found.
|
|
||||||
|
|
||||||
<one line to give the program's name and a brief idea of what it does.>
|
|
||||||
Copyright (C) <year> <name of author>
|
|
||||||
|
|
||||||
This program is free software: you can redistribute it and/or modify
|
|
||||||
it under the terms of the GNU General Public License as published by
|
|
||||||
the Free Software Foundation, either version 3 of the License, or
|
|
||||||
(at your option) any later version.
|
|
||||||
|
|
||||||
This program is distributed in the hope that it will be useful,
|
|
||||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
|
||||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
|
||||||
GNU General Public License for more details.
|
|
||||||
|
|
||||||
You should have received a copy of the GNU General Public License
|
|
||||||
along with this program. If not, see <http://www.gnu.org/licenses/>.
|
|
||||||
|
|
||||||
Also add information on how to contact you by electronic and paper mail.
|
|
||||||
|
|
||||||
If the program does terminal interaction, make it output a short
|
|
||||||
notice like this when it starts in an interactive mode:
|
|
||||||
|
|
||||||
<program> Copyright (C) <year> <name of author>
|
|
||||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
|
||||||
This is free software, and you are welcome to redistribute it
|
|
||||||
under certain conditions; type `show c' for details.
|
|
||||||
|
|
||||||
The hypothetical commands `show w' and `show c' should show the appropriate
|
|
||||||
parts of the General Public License. Of course, your program's commands
|
|
||||||
might be different; for a GUI interface, you would use an "about box".
|
|
||||||
|
|
||||||
You should also get your employer (if you work as a programmer) or school,
|
|
||||||
if any, to sign a "copyright disclaimer" for the program, if necessary.
|
|
||||||
For more information on this, and how to apply and follow the GNU GPL, see
|
|
||||||
<http://www.gnu.org/licenses/>.
|
|
||||||
|
|
||||||
The GNU General Public License does not permit incorporating your program
|
|
||||||
into proprietary programs. If your program is a subroutine library, you
|
|
||||||
may consider it more useful to permit linking proprietary applications with
|
|
||||||
the library. If this is what you want to do, use the GNU Lesser General
|
|
||||||
Public License instead of this License. But first, please read
|
|
||||||
<http://www.gnu.org/philosophy/why-not-lgpl.html>.
|
|
||||||
@@ -1,51 +0,0 @@
Spectre & Meltdown Checker
==========================

A simple shell script to tell if your Linux installation is vulnerable
to the 3 "speculative execution" CVEs:

CVE-2017-5753: bounds check bypass (Spectre Variant 1)

- Impact: Kernel & all software
- Mitigation: recompile software *and* kernel with a modified compiler that introduces the LFENCE opcode at the proper positions in the resulting code
- Performance impact of the mitigation: negligible

CVE-2017-5715: branch target injection (Spectre Variant 2)

- Impact: Kernel
- Mitigation 1: new opcode via microcode update that should be used by up-to-date compilers to protect the BTB (by flushing indirect branch predictors)
- Mitigation 2: introducing "retpoline" into compilers, and recompiling software/OS with it
- Performance impact of the mitigation: high for mitigation 1, medium for mitigation 2, depending on your CPU

CVE-2017-5754: rogue data cache load (Meltdown)

- Impact: Kernel
- Mitigation: updated kernel (with PTI/KPTI patches); updating the kernel is enough
- Performance impact of the mitigation: low to medium

Example of the output of the script:

```
$ sudo ./spectre-meltdown-checker.sh
Spectre and Meltdown mitigation detection tool v0.07

CVE-2017-5753 [bounds check bypass] aka 'Spectre Variant 1'
* Kernel compiled with LFENCE opcode inserted at the proper places: NO (only 38 opcodes found, should be >= 60)
> STATUS: VULNERABLE

CVE-2017-5715 [branch target injection] aka 'Spectre Variant 2'
* Mitigation 1
* Hardware (CPU microcode) support for mitigation: NO
* Kernel support for IBRS: NO
* IBRS enabled for Kernel space: NO
* IBRS enabled for User space: NO
* Mitigation 2
* Kernel compiled with retpolines: NO
> STATUS: VULNERABLE (IBRS hardware + kernel support OR kernel with retpolines are needed to mitigate the vulnerability)

CVE-2017-5754 [rogue data cache load] aka 'Meltdown' aka 'Variant 3'
* Kernel supports Page Table Isolation (PTI): YES
* PTI enabled and active: YES
> STATUS: NOT VULNERABLE (PTI mitigates the vulnerability)
```
@@ -0,0 +1,266 @@
# Daily transient-execution vulnerability scan — classification step

You are a scheduled agent running inside a GitHub Actions job. A preceding
workflow step has already fetched all configured sources, applied HTTP
conditional caching, deduped against prior state, and written the pre-filtered
list of new items to `new_items.json`. Your only job is to classify each item.

## Scope — read the authoritative docs before classifying

The project's own docs define what belongs in this tool. **Read them early
in the run** (once per run; Claude caches, these don't change daily):

1. **`./checker/DEVELOPMENT.md`** — "Project Mission" section. What the
   script does, what it explicitly does not do, its platform scope
   (Linux + BSD on x86/amd64/ARM/ARM64).
2. **`./checker/dist/doc/FAQ.md`** — the section titled
   _"Which rules are governing the support of a CVE in this tool?"_.
   This is the **operative test**:

   > A CVE belongs in scope when mitigating it requires **kernel
   > modifications, microcode modifications, or both** — and those
   > modifications are **detectable** by this tool (no hardcoded kernel
   > versions; look for actual mechanisms).

3. **`./checker/dist/doc/UNSUPPORTED_CVE_LIST.md`** — explicit list of
   CVEs ruled out, grouped by reason:
   - _Already covered by a parent CVE check_ (e.g. SpectreRSB ⊂ Spectre V2).
   - _No detectable kernel/microcode mitigation_ (vendor won't fix, GPU
     driver-only, userspace-only, etc.).
   - _Not a transient / speculative execution vulnerability at all_.

Match incoming items against those exclusion patterns. If a CVE is a
subvariant of a covered parent, or has no kernel/microcode mitigation
this tool can detect, or is simply not a transient-execution issue, it
is **unrelated** — not `tocheck`. Out-of-scope items with zero ambiguity
should not linger in the `tocheck` backlog.

In-scope shortlist (for quick reference; the README's CVE table is the
authoritative source): Spectre v1/v2/v4, Meltdown, Foreshadow/L1TF,
MDS (ZombieLoad/RIDL/Fallout), TAA, SRBDS, iTLB Multihit, MMIO Stale
Data, Retbleed, Zenbleed, Downfall (GDS), Inception/SRSO, DIV0, Reptar,
RFDS, ITS, TSA-SQ/TSA-L1, VMScape, BPI, FP-DSS — and similar
microarchitectural side-channel / speculative-execution issues on
Intel / AMD / ARM CPUs with a detectable mitigation.

Explicitly out of scope: generic software CVEs, GPU driver bugs,
networking stacks, filesystem bugs, userspace crypto issues, unrelated
kernel subsystems, CPU bugs that the industry has decided not to mitigate
(nothing for the tool to check), and CVEs fixed by userspace/SDK updates
only.

## Inputs

- `new_items.json` — shape:

  ```json
  {
    "scan_date": "2026-04-18T14:24:43+00:00",
    "window_cutoff": "2026-04-17T13:24:43+00:00",
    "per_source": { "phoronix": {"status": 200, "new": 2, "total_in_feed": 75} },
    "items": [
      {
        "source": "phoronix",
        "stable_id": "CVE-2026-1234",
        "title": "...",
        "permalink": "https://...",
        "guid": "...",
        "published_at": "2026-04-18T05:00:00+00:00",
        "extracted_cves": ["CVE-2026-1234"],
        "vendor_ids": [],
        "snippet": "first 400 chars of description, tags stripped"
      }
    ],
    "reconsider": [
      {
        "canonical_id": "INTEL-SA-00145",
        "current_bucket": "toimplement",
        "title": "Lazy FP State Restore",
        "sources": ["intel-psirt"],
        "urls": ["https://www.intel.com/.../intel-sa-00145.html"],
        "extracted_cves": [],
        "first_seen": "2026-04-19T09:41:44+00:00"
      }
    ]
  }
  ```

- `items` are fresh observations from today's fetch: already inside the
  time window and not yet present in state under any alt-ID.
- `reconsider` holds existing `toimplement`/`tocheck` entries from state,
  submitted for re-review each run (see the "Reconsideration" section
  below). On days where both arrays are empty, write stub output files
  with `(no new items in this window)`.

- `./checker/` is a checkout of the **`test`** branch of this repo (the
  development branch where coded-but-unreleased CVE checks live). This is
  the source of truth for whether a CVE is already covered. Grep this
  directory — not the working directory root, which only holds the
  vuln-watch scripts and has no checker code.
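The empty-day rule above can be sketched as follows. This is illustrative glue code only, not part of the pipeline; only the file shape and the stub string come from the prompt.

```python
import json

# Hypothetical sketch: read the pre-filtered input and apply the
# empty-day rule described above.
raw = '{"scan_date": "2026-04-18T14:24:43+00:00", "items": [], "reconsider": []}'
data = json.loads(raw)

if not data["items"] and not data["reconsider"]:
    stub = "(no new items in this window)"
    print(stub)
```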
## Classification rules

For each item in `items`, pick exactly one bucket:

- **toimplement** — clearly in-scope per the FAQ test (kernel/microcode
  mitigation exists AND is detectable by this tool), and **not already
  covered** by `./checker/`. Verify the second half: grep `./checker/`
  for each `extracted_cves` entry *and* for any codename in the title
  (e.g., "FP-DSS", "Inception"). If either matches, the right bucket is
  `unrelated` (already covered) or `tocheck` (maintainer should confirm
  whether an existing check handles the new variant).
- **tocheck** — there is a **specific question a maintainer must answer**
  before this can be filed anywhere else. Examples:
  - Ambiguity about whether an existing check (e.g. parent Spectre V2)
    transitively covers this new sub-variant, or whether a fresh entry
    is warranted.
  - Embedded-only ARM SKU and it's unclear if the tool's ARM support
    reaches that class of SKU.
  - Vendor advisory published without a CVE ID yet, but the vuln looks
    in-scope; revisit once the CVE is assigned.
  - Contradictory statements across sources about whether a mitigation
    is detectable (kernel-patch vs. userspace-only vs. microcode).

  **Do NOT use `tocheck` as a catch-all** for "I'm not sure". Most items
  have a clear answer once you consult UNSUPPORTED_CVE_LIST.md and the
  FAQ rule. If you can articulate the specific question a maintainer
  needs to answer — `tocheck`. If the only reason is "maybe?" — it's
  `unrelated`.

- **unrelated** — everything else. Including:
  - Matches a pattern in UNSUPPORTED_CVE_LIST.md (subvariant of covered
    parent, no detectable mitigation, not transient-execution).
  - Fails the FAQ rule (userspace-only fix, driver update, industry
    decided not to mitigate).
  - Non-CPU security topic (kernel filesystem bug, network stack, crypto
    library, GPU driver, compiler flag change, distro release notes).

**Tie-breakers** (note the direction — this used to bias the other way):

- Prefer `unrelated` over `tocheck` when the item matches a category in
  UNSUPPORTED_CVE_LIST.md or plainly fails the FAQ rule. Growing the
  `tocheck` backlog with obvious-unrelateds wastes human time more than
  a confident `unrelated` does.
- Prefer `tocheck` over `toimplement` when the CVE is still "reserved" /
  "pending" — false positives in `toimplement` create phantom work.

`WebFetch` is available for resolving genuine `tocheck` ambiguity.
Budget: **3 follow-ups per run total**. Do not use it for items you
already plan to file as `unrelated` or `toimplement`.
## Reconsideration rules (for `reconsider` entries)

Each `reconsider` entry is an item *already* in state under `current_bucket`
= `toimplement` or `tocheck`, from a prior run. Re-examine it against the
**current** `./checker/` tree and the scope docs above. This pass is the
right place to prune the `tocheck` backlog: prior runs (before these
scope docs were wired in) may have hedged on items that now have a clear
`unrelated` answer — demote them aggressively. You may:

- **Demote** `toimplement` → `tocheck` or `unrelated` if the checker now
  covers the CVE/codename (grep confirms), or if reinterpreting the
  advisory shows it's out of scope.
- **Demote** `tocheck` → `unrelated` if new context settles the ambiguity
  as out-of-scope.
- **Promote** `tocheck` → `toimplement` if you now have firm evidence it's
  a real, in-scope, not-yet-covered CVE.
- **Leave it unchanged** (same bucket) — emit a record anyway; it's cheap
  and documents that the reconsideration happened today.
- **Reassign the canonical ID** — if a CVE has since been assigned to a
  vendor advisory (e.g., an INTEL-SA that previously had no CVE), put the
  CVE in `extracted_cves` and use it as the new `canonical_id`. The merge
  step will rekey the record under the CVE and keep the old ID as an alias.

For every reconsider record you emit, set `"reconsider": true` in its
classification entry — this tells the merge step to **overwrite** the
stored bucket (including demotions), not just promote.

## Outputs

Compute `TODAY` = the `YYYY-MM-DD` prefix of `scan_date`. Write three files at
the repo root, overwriting if present:

- `watch_${TODAY}_toimplement.md`
- `watch_${TODAY}_tocheck.md`
- `watch_${TODAY}_unrelated.md`

These delta files cover the **`items`** array only — they answer "what
did today's fetch surface". Reconsider decisions update state (and surface
in the `current_*.md` snapshots the merge step rewrites); don't duplicate
them here.

Each file uses level-2 headers per source short-name, then one bullet per
item: the stable ID, the permalink, and 1–2 sentences of context.

```markdown
## oss-sec
- **CVE-2026-1234** — https://www.openwall.com/lists/oss-security/2026/04/18/3
  New Intel transient-execution bug "Foo"; affects Redwood Cove cores.
  Not yet covered (grepped CVE-2026-1234 and "Foo" — no matches).
```

If a bucket has no items, write `(no new items in this window)`.

Append the following block to the **tocheck** file (creating it if
otherwise empty):

```markdown
## Run summary
- scan_date: <value>
- per-source counts (from per_source): ...
- fetch failures (status != 200/304): ...
- total classified this run: toimplement=<n>, tocheck=<n>, unrelated=<n>
- reconsidered: <n> entries re-reviewed; <list any bucket transitions, e.g.
  "CVE-2018-3665: toimplement -> tocheck (now covered at src/vulns/...)">,
  or "no transitions" if every reconsider kept its existing bucket.
```

## `classifications.json` — required side-channel for the merge step

Also write `classifications.json` at the repo root. It is a JSON array, one
record per input item (both `items` and `reconsider`):

```json
[
  {
    "stable_id": "CVE-2026-1234",
    "canonical_id": "CVE-2026-1234",
    "bucket": "toimplement",
    "extracted_cves": ["CVE-2026-1234"],
    "sources": ["phoronix"],
    "urls": ["https://www.phoronix.com/news/..."]
  }
]
```

Rules:

- One record per input item (`items` + `reconsider`). For items, use the
  same `stable_id` as in `new_items.json`. For reconsider entries, use the
  entry's `canonical_id` from state as the record's `stable_id`.
- `canonical_id`: prefer the first `extracted_cves` entry if any; otherwise
  the item's `stable_id`. **Use the same `canonical_id` for multiple items
  that are really the same CVE from different sources** — the merge step
  will collapse them into one entry and add alias rows automatically.
- **Populate `extracted_cves` / `canonical_id` from context when the feed
  didn't.** If the title, body, or a well-known transient-execution codename
  mapping lets you identify a CVE the feed didn't emit (e.g., "Lazy FP
  State Restore" → `CVE-2018-3665`, "LazyFP" → same, "FP-DSS" → whatever
  CVE AMD/Intel assigned), put the CVE in `extracted_cves` and use it as
  `canonical_id`. This prevents Intel's CVE-less listing entries from
  creating orphan `INTEL-SA-NNNNN` records in the backlog.
- `sources` / `urls`: arrays; default to the item's own single source and
  permalink if you didn't enrich further.
- **`reconsider: true`** — set on every record that corresponds to an
  input from the `reconsider` array. The merge step uses this flag to
  overwrite the stored bucket instead of merging by "strongest wins" —
  this is what enables demotions.
- If both `items` and `reconsider` are empty, write `[]`.
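The canonical-ID collapsing promised above could look roughly like this in the merge step. This is a hypothetical sketch (the real merge code lives elsewhere in the pipeline); the record fields follow the schema shown above, the IDs are invented.

```python
# Hypothetical sketch: collapse two records that share a canonical_id
# into one entry, keeping divergent stable_ids as alias rows.
records = [
    {"stable_id": "CVE-2026-1234", "canonical_id": "CVE-2026-1234",
     "bucket": "toimplement", "sources": ["phoronix"]},
    {"stable_id": "openwall-2026-04-18-3", "canonical_id": "CVE-2026-1234",
     "bucket": "toimplement", "sources": ["oss-sec"]},
]

merged: dict[str, dict] = {}
for r in records:
    e = merged.setdefault(r["canonical_id"],
                          {"bucket": r["bucket"], "sources": [], "aliases": []})
    e["sources"] += [s for s in r["sources"] if s not in e["sources"]]
    if r["stable_id"] != r["canonical_id"]:
        e["aliases"].append(r["stable_id"])  # alias row for the old/source ID

print(merged["CVE-2026-1234"]["sources"])  # ['phoronix', 'oss-sec']
print(merged["CVE-2026-1234"]["aliases"])  # ['openwall-2026-04-18-3']
```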
## Guardrails

- Do NOT modify any repo source code. Only write the four output files.
- Do NOT create commits, branches, or PRs.
- Do NOT call tools that post externally (Slack, GitHub comments, issues, …).
- Do NOT re-fetch the RSS/HTML sources — that was the prior step's job.
  `WebFetch` is only for drilling into a specific advisory/article URL to
  resolve a `tocheck` ambiguity (budget 3).
- If total runtime exceeds 10 minutes, finish what you have, write partial
  outputs (+ a note in the tocheck run summary), and exit cleanly.
@@ -0,0 +1,570 @@
#!/usr/bin/env python3
"""Fetch all configured sources, dedup against state/seen.json, emit new_items.json.

Writes updated per-source HTTP cache metadata (etag, last_modified, hwm_*) back
into state/seen.json. Does NOT touch state.seen / state.aliases — that is the
merge step's job, after Claude has classified the new items.

Usage:
    SCAN_DATE=2026-04-18T14:24:43Z python -m scripts.vuln_watch.fetch_and_diff
"""
from __future__ import annotations

import argparse
import datetime
import gzip
import json
import os
import pathlib
import re
import sys
import urllib.error
import urllib.parse
import urllib.request
from typing import Any, Iterable

import feedparser  # type: ignore[import-untyped]

from .sources import REQUEST_TIMEOUT, SOURCES, Source, USER_AGENT
from . import state


CVE_RE = re.compile(r"CVE-\d{4}-\d{4,7}")
DEFAULT_WINDOW_HOURS = 25
DEFAULT_RECONSIDER_AGE_DAYS = 7
MAX_ITEMS_PER_FEED = 200
SNIPPET_MAX = 400
NEW_ITEMS_PATH = pathlib.Path("new_items.json")


def parse_iso(ts: str | None) -> datetime.datetime | None:
    if not ts:
        return None
    try:
        return datetime.datetime.fromisoformat(ts.replace("Z", "+00:00"))
    except ValueError:
        return None
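The `.replace("Z", "+00:00")` shim in `parse_iso` exists because `datetime.datetime.fromisoformat` on Python versions before 3.11 rejects a trailing `Z`. A quick standalone check of the same trick:

```python
import datetime

# Rewrite the RFC 3339 "Z" suffix to an explicit UTC offset before
# parsing, as parse_iso() above does.
ts = "2026-04-18T14:24:43Z"
dt = datetime.datetime.fromisoformat(ts.replace("Z", "+00:00"))
print(dt.isoformat())  # 2026-04-18T14:24:43+00:00
```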
def now_from_scan_date(scan_date: str) -> datetime.datetime:
    dt = parse_iso(scan_date)
    if dt is None:
        dt = datetime.datetime.now(datetime.timezone.utc)
    return dt


def conditional_get(
    url: str,
    etag: str | None,
    last_modified: str | None,
    user_agent: str = USER_AGENT,
) -> tuple[int | str, bytes | None, str | None, str | None]:
    """Perform a conditional GET.

    Returns (status, body, new_etag, new_last_modified).

    status is:
    - 200 with body on success
    - 304 with body=None when unchanged
    - an int HTTP error code on server-side errors
    - a string describing a network/transport failure
    """
    req = urllib.request.Request(url, headers={
        "User-Agent": user_agent,
        # AMD's CDN stalls on non-gzip clients; asking for gzip speeds up
        # every source and is strictly beneficial (we decompress locally).
        "Accept-Encoding": "gzip",
    })
    if etag:
        req.add_header("If-None-Match", etag)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    try:
        with urllib.request.urlopen(req, timeout=REQUEST_TIMEOUT) as resp:
            body = resp.read()
            if resp.headers.get("Content-Encoding", "").lower() == "gzip":
                try:
                    body = gzip.decompress(body)
                except OSError:
                    pass  # server lied about encoding; use as-is
            return (
                resp.status,
                body,
                resp.headers.get("ETag"),
                resp.headers.get("Last-Modified"),
            )
    except urllib.error.HTTPError as e:
        if e.code == 304:
            return (304, None, etag, last_modified)
        return (e.code, None, etag, last_modified)
    except (urllib.error.URLError, TimeoutError, OSError) as e:
        return (f"network:{type(e).__name__}", None, etag, last_modified)
def extract_cves(text: str) -> list[str]:
    seen: set[str] = set()
    out: list[str] = []
    for m in CVE_RE.findall(text or ""):
        if m not in seen:
            seen.add(m)
            out.append(m)
    return out
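`extract_cves` deduplicates while preserving first-seen order. The function and its regex are repeated here so the check runs standalone:

```python
import re

CVE_RE = re.compile(r"CVE-\d{4}-\d{4,7}")

def extract_cves(text: str) -> list[str]:
    # First occurrence wins; later duplicates are dropped.
    seen: set[str] = set()
    out: list[str] = []
    for m in CVE_RE.findall(text or ""):
        if m not in seen:
            seen.add(m)
            out.append(m)
    return out

blob = "Fixes CVE-2017-5715; related to CVE-2017-5753 and again CVE-2017-5715"
print(extract_cves(blob))  # ['CVE-2017-5715', 'CVE-2017-5753']
print(extract_cves(None))  # []  (the `text or ""` guard)
```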
def extract_vendor_ids(text: str, patterns: Iterable[str]) -> list[str]:
|
||||||
|
seen: set[str] = set()
|
||||||
|
out: list[str] = []
|
||||||
|
for p in patterns:
|
||||||
|
for m in re.findall(p, text or ""):
|
||||||
|
if m not in seen:
|
||||||
|
seen.add(m)
|
||||||
|
out.append(m)
|
||||||
|
return out
|
||||||
|
|
||||||
|
|
||||||
|
def pick_stable_id(vendor_ids: list[str], cves: list[str], guid: str, link: str) -> str:
    """Pick canonical-ish stable ID: vendor advisory → CVE → guid → permalink.

    CVE is preferred over guid/URL so that the same CVE seen via different
    feeds collapses on its stable_id alone (in addition to the alias map).
    """
    if vendor_ids:
        return vendor_ids[0]
    if cves:
        return cves[0]
    if guid:
        return guid
    return link


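The fallback chain reads as a small truth table; copying the four-line body into a standalone sketch makes the precedence easy to verify:

```python
def pick_stable_id(vendor_ids, cves, guid, link):
    # vendor advisory -> CVE -> guid -> permalink
    if vendor_ids:
        return vendor_ids[0]
    if cves:
        return cves[0]
    if guid:
        return guid
    return link

assert pick_stable_id(["INTEL-SA-00001"], ["CVE-2024-0001"], "g", "u") == "INTEL-SA-00001"
assert pick_stable_id([], ["CVE-2024-0001"], "g", "u") == "CVE-2024-0001"
assert pick_stable_id([], [], "g", "u") == "g"
assert pick_stable_id([], [], "", "u") == "u"
```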
def clean_snippet(s: str) -> str:
    s = re.sub(r"<[^>]+>", " ", s or "")
    s = re.sub(r"\s+", " ", s)
    return s.strip()


def _struct_time_to_iso(st: Any) -> str | None:
    if not st:
        return None
    try:
        return datetime.datetime(*st[:6], tzinfo=datetime.timezone.utc).isoformat()
    except (TypeError, ValueError):
        return None


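feedparser exposes `published_parsed` as a `time.struct_time`; the helper above slices its first six fields into an aware-UTC datetime. A standalone check of that conversion:

```python
import datetime
import time

st = time.gmtime(0)  # 1970-01-01T00:00:00 UTC as a struct_time
iso = datetime.datetime(*st[:6], tzinfo=datetime.timezone.utc).isoformat()
assert iso == "1970-01-01T00:00:00+00:00"
```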
def parse_feed_body(src: Source, body: bytes) -> list[dict[str, Any]]:
    parsed = feedparser.parse(body)
    items: list[dict[str, Any]] = []
    for entry in parsed.entries[:MAX_ITEMS_PER_FEED]:
        link = (entry.get("link") or "").strip()
        guid = (entry.get("id") or entry.get("guid") or "").strip()
        title = (entry.get("title") or "").strip()
        summary = entry.get("summary") or ""
        published_at = (
            _struct_time_to_iso(entry.get("published_parsed"))
            or _struct_time_to_iso(entry.get("updated_parsed"))
        )
        blob = f"{title}\n{summary}"
        cves = extract_cves(blob)
        vendor_ids = extract_vendor_ids(blob, src.advisory_id_patterns)
        stable_id = pick_stable_id(vendor_ids, cves, guid, link)
        items.append({
            "source": src.name,
            "stable_id": stable_id,
            "title": title,
            "permalink": link,
            "guid": guid,
            "published_at": published_at,
            "extracted_cves": cves,
            "vendor_ids": vendor_ids,
            "snippet": clean_snippet(summary)[:SNIPPET_MAX],
        })
    return items


def _parse_intel_psirt(src: Source, text: str) -> list[dict[str, Any]]:
    """Intel's security-center page uses a table of <tr class="data"> rows:

        <tr class="data" ...>
          <td ...><a href="/.../intel-sa-NNNNN.html">Title</a></td>
          <td>INTEL-SA-NNNNN</td>
          <td>March 10, 2026</td>   <- Last updated
          <td>March 10, 2026</td>   <- First published
        </tr>

    We pick the later of the two dates as `published_at` (most recent
    activity) so updates to older advisories also show up in the window.
    """
    items: list[dict[str, Any]] = []
    seen_ids: set[str] = set()
    permalink_base = src.display_url or src.url
    for m in re.finditer(r'<tr class="data"[^>]*>(.*?)</tr>', text, re.DOTALL):
        row = m.group(1)
        sid = re.search(r'INTEL-SA-\d+', row)
        if not sid:
            continue
        advisory_id = sid.group(0)
        if advisory_id in seen_ids:
            continue
        seen_ids.add(advisory_id)
        link_m = re.search(r'href="([^"#]+)"', row)
        permalink = urllib.parse.urljoin(permalink_base, link_m.group(1)) if link_m else permalink_base
        title_m = re.search(r'<a[^>]*>([^<]+)</a>', row)
        title = title_m.group(1).strip() if title_m else advisory_id
        published_at: str | None = None
        for ds in re.findall(r'<td[^>]*>\s*([A-Z][a-z]+ \d{1,2}, \d{4})\s*</td>', row):
            try:
                dt = datetime.datetime.strptime(ds, "%B %d, %Y").replace(tzinfo=datetime.timezone.utc)
                iso = dt.isoformat()
                if published_at is None or iso > published_at:
                    published_at = iso
            except ValueError:
                continue
        items.append({
            "source": src.name,
            "stable_id": advisory_id,
            "title": title,
            "permalink": permalink,
            "guid": "",
            "published_at": published_at,
            "extracted_cves": extract_cves(row),
            "vendor_ids": [advisory_id],
            "snippet": clean_snippet(row)[:SNIPPET_MAX],
        })
    return items[:MAX_ITEMS_PER_FEED]


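The "later of the two dates" selection works on strings because aware-UTC `isoformat()` output with a fixed offset sorts lexicographically in chronological order. A standalone sketch of the two-cell case (the date format is the one named in the docstring):

```python
import datetime

def to_iso(ds):
    dt = datetime.datetime.strptime(ds, "%B %d, %Y").replace(tzinfo=datetime.timezone.utc)
    return dt.isoformat()

cells = ["March 10, 2026", "January 5, 2025"]  # Last updated, First published
published_at = None
for ds in cells:
    iso = to_iso(ds)
    if published_at is None or iso > published_at:
        published_at = iso
assert published_at == "2026-03-10T00:00:00+00:00"
```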
def _parse_amd_psirt(src: Source, text: str) -> list[dict[str, Any]]:
    """AMD's product-security page has a bulletin table where each row ends
    with two `<td data-sort="YYYY-MM-DD HHMMSS">` cells (Published Date,
    Last Updated Date). The machine-readable `data-sort` attribute is far
    easier to parse than the human-readable text alongside it.
    """
    items: list[dict[str, Any]] = []
    seen_ids: set[str] = set()
    permalink_base = src.display_url or src.url
    for m in re.finditer(r'<tr[^>]*>(.*?AMD-SB-\d+.*?)</tr>', text, re.DOTALL):
        row = m.group(1)
        sid = re.search(r'AMD-SB-\d+', row)
        if not sid:
            continue
        advisory_id = sid.group(0)
        if advisory_id in seen_ids:
            continue
        seen_ids.add(advisory_id)
        link_m = re.search(r'href="([^"#]+)"', row)
        permalink = urllib.parse.urljoin(permalink_base, link_m.group(1)) if link_m else permalink_base
        title_m = re.search(r'<a[^>]*>([^<]+)</a>', row)
        title = title_m.group(1).strip() if title_m else advisory_id
        published_at: str | None = None
        for (y, mo, d, h, mi, s) in re.findall(
            r'data-sort="(\d{4})-(\d{2})-(\d{2})\s+(\d{2})(\d{2})(\d{2})"', row
        ):
            iso = f"{y}-{mo}-{d}T{h}:{mi}:{s}+00:00"
            if published_at is None or iso > published_at:
                published_at = iso
        items.append({
            "source": src.name,
            "stable_id": advisory_id,
            "title": title,
            "permalink": permalink,
            "guid": "",
            "published_at": published_at,
            "extracted_cves": extract_cves(row),
            "vendor_ids": [advisory_id],
            "snippet": clean_snippet(row)[:SNIPPET_MAX],
        })
    return items[:MAX_ITEMS_PER_FEED]


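A standalone check of the `data-sort` regex above against a fabricated cell (the `YYYY-MM-DD HHMMSS` attribute format is taken from the docstring; the sample row is invented):

```python
import re

row = '<td data-sort="2026-03-10 143005">Mar 10, 2026</td>'
pat = r'data-sort="(\d{4})-(\d{2})-(\d{2})\s+(\d{2})(\d{2})(\d{2})"'
(y, mo, d, h, mi, s), = re.findall(pat, row)
iso = f"{y}-{mo}-{d}T{h}:{mi}:{s}+00:00"
assert iso == "2026-03-10T14:30:05+00:00"
```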
def _parse_html_generic(src: Source, text: str) -> list[dict[str, Any]]:
    """Fallback regex-only extractor for HTML sources with no known table
    layout (arm-spec, transient-fail's tree.js). Emits `published_at=None`
    — items pass the window filter as fail-safe, but state.seen dedup
    prevents re-emission across runs."""
    items: list[dict[str, Any]] = []
    seen_ids: set[str] = set()
    permalink_base = src.display_url or src.url
    for pat in src.advisory_id_patterns:
        for m in re.finditer(pat, text):
            advisory_id = m.group(0)
            if advisory_id in seen_ids:
                continue
            seen_ids.add(advisory_id)
            window = text[max(0, m.start() - 400): m.end() + 400]
            href_match = re.search(r'href="([^"#]+)"', window)
            if href_match:
                permalink = urllib.parse.urljoin(permalink_base, href_match.group(1))
            else:
                permalink = permalink_base
            cves_in_window = extract_cves(window)
            is_cve = advisory_id.startswith("CVE-")
            cves = cves_in_window if not is_cve else list({advisory_id, *cves_in_window})
            vendor_ids = [] if is_cve else [advisory_id]
            items.append({
                "source": src.name,
                "stable_id": advisory_id,
                "title": advisory_id,
                "permalink": permalink,
                "guid": "",
                "published_at": None,
                "extracted_cves": cves,
                "vendor_ids": vendor_ids,
                "snippet": clean_snippet(window)[:SNIPPET_MAX],
            })
    return items[:MAX_ITEMS_PER_FEED]


_HTML_PARSERS = {
    "intel-psirt": _parse_intel_psirt,
    "amd-psirt": _parse_amd_psirt,
}


def parse_html_body(src: Source, body: bytes) -> list[dict[str, Any]]:
    """Dispatch to a per-source HTML parser when one is registered;
    fall back to the generic regex-over-advisory-IDs extractor."""
    text = body.decode("utf-8", errors="replace")
    parser = _HTML_PARSERS.get(src.name, _parse_html_generic)
    return parser(src, text)


def parse_body(src: Source, body: bytes) -> list[dict[str, Any]]:
    return parse_feed_body(src, body) if src.kind in ("rss", "atom") else parse_html_body(src, body)


def compute_cutoff(
    scan_now: datetime.datetime,
    last_run: str | None,
    window_hours: float = DEFAULT_WINDOW_HOURS,
) -> datetime.datetime:
    base = scan_now - datetime.timedelta(hours=window_hours)
    lr = parse_iso(last_run)
    if lr is None:
        return base
    widened = scan_now - (scan_now - lr + datetime.timedelta(hours=1))
    return min(base, widened)


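The `widened` expression simplifies algebraically: `scan_now - (scan_now - lr + 1h)` is exactly `lr - 1h`, one hour before the last run, so `min(base, widened)` stretches the window back whenever the previous run is older than the base window. A standalone check of the identity (the 26 h window here is only illustrative, not necessarily the module's default):

```python
import datetime

utc = datetime.timezone.utc
scan_now = datetime.datetime(2026, 3, 10, 12, 0, tzinfo=utc)
lr = datetime.datetime(2026, 3, 8, 12, 0, tzinfo=utc)  # last run, 48 h ago

widened = scan_now - (scan_now - lr + datetime.timedelta(hours=1))
assert widened == lr - datetime.timedelta(hours=1)  # the algebraic identity

base = scan_now - datetime.timedelta(hours=26)  # illustrative window
# Last run predates the base window, so the widened cutoff wins.
assert min(base, widened) == widened
```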
def _resolve_window_hours() -> float:
    """Pick up WINDOW_HOURS from the environment (set by workflow_dispatch).
    Falls back to DEFAULT_WINDOW_HOURS for cron runs or local invocations."""
    raw = os.environ.get("WINDOW_HOURS", "").strip()
    if not raw:
        return float(DEFAULT_WINDOW_HOURS)
    try:
        v = float(raw)
        if v <= 0:
            raise ValueError("must be > 0")
        return v
    except ValueError:
        print(f"warning: ignoring invalid WINDOW_HOURS={raw!r}, using {DEFAULT_WINDOW_HOURS}",
              file=sys.stderr)
        return float(DEFAULT_WINDOW_HOURS)


def _resolve_reconsider_age_days() -> float:
    """Pick up RECONSIDER_AGE_DAYS from the environment. Entries whose last
    review (reconsidered_at, or first_seen if never reconsidered) is more
    recent than this many days ago are skipped. 0 = reconsider everything
    every run (no throttle)."""
    raw = os.environ.get("RECONSIDER_AGE_DAYS", "").strip()
    if not raw:
        return float(DEFAULT_RECONSIDER_AGE_DAYS)
    try:
        v = float(raw)
        if v < 0:
            raise ValueError("must be >= 0")
        return v
    except ValueError:
        print(f"warning: ignoring invalid RECONSIDER_AGE_DAYS={raw!r}, "
              f"using {DEFAULT_RECONSIDER_AGE_DAYS}", file=sys.stderr)
        return float(DEFAULT_RECONSIDER_AGE_DAYS)


def backlog_to_reconsider(
    data: dict[str, Any],
    scan_now: datetime.datetime,
    min_age_days: float = DEFAULT_RECONSIDER_AGE_DAYS,
) -> list[dict[str, Any]]:
    """Walk state.seen and emit toimplement/tocheck entries for re-review.

    Throttle: skip entries whose "last review" timestamp is more recent
    than `min_age_days` ago. "Last review" is `reconsidered_at` if Claude
    has already reconsidered the entry at least once, otherwise
    `first_seen` (the initial classification was itself a review). With
    `min_age_days=0` the throttle is disabled — every qualifying entry
    is emitted on every run.

    Items in `unrelated` are never emitted — those are settled.
    A CVE alias pointing at this canonical is included in `extracted_cves`
    so Claude sees every known CVE for the item without having to consult
    the full alias map.
    """
    seen = data.get("seen", {})
    aliases = data.get("aliases", {})
    by_canonical: dict[str, list[str]] = {}
    for alt, canon in aliases.items():
        by_canonical.setdefault(canon, []).append(alt)

    # Any entry whose last review is newer than this ISO cutoff is throttled.
    cutoff = (scan_now - datetime.timedelta(days=min_age_days)).isoformat()

    out: list[dict[str, Any]] = []
    for canonical, rec in seen.items():
        if rec.get("bucket") not in ("toimplement", "tocheck"):
            continue
        last_reviewed = rec.get("reconsidered_at") or rec.get("first_seen") or ""
        if min_age_days > 0 and last_reviewed and last_reviewed > cutoff:
            continue
        cves: list[str] = []
        if canonical.startswith("CVE-"):
            cves.append(canonical)
        for alt in by_canonical.get(canonical, []):
            if alt.startswith("CVE-") and alt not in cves:
                cves.append(alt)
        out.append({
            "canonical_id": canonical,
            "current_bucket": rec.get("bucket"),
            "title": rec.get("title") or "",
            "sources": list(rec.get("sources") or []),
            "urls": list(rec.get("urls") or []),
            "extracted_cves": cves,
            "first_seen": rec.get("first_seen"),
            "reconsidered_at": rec.get("reconsidered_at"),
        })
    return out


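The throttle compares ISO-8601 strings directly; that is safe here because every timestamp the pipeline writes is aware UTC with the same `+00:00` offset, so lexicographic order matches chronological order. A standalone check with fabricated review timestamps:

```python
import datetime

utc = datetime.timezone.utc
scan_now = datetime.datetime(2026, 3, 10, tzinfo=utc)
cutoff = (scan_now - datetime.timedelta(days=7)).isoformat()

recently_reviewed = datetime.datetime(2026, 3, 9, tzinfo=utc).isoformat()
stale_review = datetime.datetime(2026, 2, 1, tzinfo=utc).isoformat()

assert recently_reviewed > cutoff   # newer than cutoff: throttled, skipped
assert not (stale_review > cutoff)  # old enough: emitted for re-review
```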
def candidate_ids(item: dict[str, Any]) -> list[str]:
    """All identifiers under which this item might already be known."""
    seen: set[str] = set()
    out: list[str] = []
    for cand in (
        *(item.get("extracted_cves") or []),
        *(item.get("vendor_ids") or []),
        item.get("stable_id"),
        item.get("guid"),
        item.get("permalink"),
    ):
        if cand and cand not in seen:
            seen.add(cand)
            out.append(cand)
    return out


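A standalone sketch of the lookup-key ordering above: CVEs first, then vendor IDs, then stable_id/guid/permalink, with empty values dropped and duplicates collapsed to their first occurrence (the item dict is fabricated):

```python
def candidate_ids(item):
    seen, out = set(), []
    for cand in (
        *(item.get("extracted_cves") or []),
        *(item.get("vendor_ids") or []),
        item.get("stable_id"),
        item.get("guid"),
        item.get("permalink"),
    ):
        if cand and cand not in seen:
            seen.add(cand)
            out.append(cand)
    return out

item = {
    "extracted_cves": ["CVE-2025-0001"],
    "vendor_ids": ["AMD-SB-7000"],
    "stable_id": "AMD-SB-7000",   # duplicate of the vendor ID, collapsed
    "guid": "",                   # empty, dropped
    "permalink": "https://example.com/sb7000",
}
assert candidate_ids(item) == ["CVE-2025-0001", "AMD-SB-7000", "https://example.com/sb7000"]
```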
def main() -> int:
    ap = argparse.ArgumentParser()
    ap.add_argument("--scan-date", default=os.environ.get("SCAN_DATE", ""))
    ap.add_argument("--output", type=pathlib.Path, default=NEW_ITEMS_PATH)
    args = ap.parse_args()

    scan_now = now_from_scan_date(args.scan_date)
    scan_date_iso = scan_now.isoformat()
    window_hours = _resolve_window_hours()
    reconsider_age_days = _resolve_reconsider_age_days()
    data = state.load()
    cutoff = compute_cutoff(scan_now, data.get("last_run"), window_hours)

    per_source: dict[str, dict[str, Any]] = {}
    all_new: list[dict[str, Any]] = []

    for src in SOURCES:
        meta = dict(data["sources"].get(src.name, {}))
        status, body, etag, last_modified = conditional_get(
            src.url, meta.get("etag"), meta.get("last_modified"),
            user_agent=src.user_agent or USER_AGENT,
        )
        meta["last_fetched_at"] = scan_date_iso
        meta["last_status"] = status

        if isinstance(status, str) or (isinstance(status, int) and status >= 400 and status != 304):
            per_source[src.name] = {"status": status, "new": 0}
            data["sources"][src.name] = meta
            continue

        if status == 304 or body is None:
            per_source[src.name] = {"status": 304, "new": 0}
            data["sources"][src.name] = meta
            continue

        # Refresh cache headers only on successful 200.
        if etag:
            meta["etag"] = etag
        if last_modified:
            meta["last_modified"] = last_modified

        items = parse_body(src, body)
        total = len(items)

        in_window = []
        for it in items:
            pub = parse_iso(it.get("published_at"))
            if pub is None or pub >= cutoff:
                in_window.append(it)

        new: list[dict[str, Any]] = []
        hwm_pub = meta.get("hwm_published_at")
        hwm_id = meta.get("hwm_id")
        for it in in_window:
            if state.lookup(data, candidate_ids(it)) is not None:
                continue
            new.append(it)
            pub = it.get("published_at")
            if pub and (not hwm_pub or pub > hwm_pub):
                hwm_pub = pub
                hwm_id = it.get("stable_id")

        if new:
            meta["hwm_published_at"] = hwm_pub
            meta["hwm_id"] = hwm_id

        data["sources"][src.name] = meta
        per_source[src.name] = {"status": status, "new": len(new), "total_in_feed": total}
        all_new.extend(new)

    # Persist updated HTTP cache metadata regardless of whether Claude runs.
    state.save(data)

    reconsider = backlog_to_reconsider(data, scan_now, reconsider_age_days)

    out = {
        "scan_date": scan_date_iso,
        "window_cutoff": cutoff.isoformat(),
        "per_source": per_source,
        "items": all_new,
        "reconsider": reconsider,
    }
    args.output.write_text(json.dumps(out, indent=2, sort_keys=True) + "\n")

    # GitHub Actions step outputs. Downstream `if:` conditions gate the
    # classify step on `new_count || reconsider_count`; both must be 0
    # for Claude to be skipped.
    gh_out = os.environ.get("GITHUB_OUTPUT")
    if gh_out:
        with open(gh_out, "a") as f:
            f.write(f"new_count={len(all_new)}\n")
            f.write(f"reconsider_count={len(reconsider)}\n")
            failures = [
                s for s, v in per_source.items()
                if not (isinstance(v["status"], int) and v["status"] in (200, 304))
            ]
            f.write(f"fetch_failures_count={len(failures)}\n")

    print(f"Scan date: {scan_date_iso}")
    print(f"Window: {window_hours:g} h")
    print(f"Cutoff: {cutoff.isoformat()}")
    print(f"New items: {len(all_new)}")
    if reconsider_age_days == 0:
        print(f"Reconsider: {len(reconsider)} (throttle disabled)")
    else:
        print(f"Reconsider: {len(reconsider)} (throttle: "
              f"skip entries reviewed <{reconsider_age_days:g}d ago)")
    for s, v in per_source.items():
        print(f"  {s:14s} status={str(v['status']):>16} new={v['new']}")

    return 0


if __name__ == "__main__":
    sys.exit(main())
@@ -0,0 +1,298 @@
#!/usr/bin/env python3
"""Merge Claude's classifications.json into state/seen.json.

Inputs:
  state/seen.json        (already has updated .sources from fetch_and_diff)
  classifications.json   (written by the Claude step; list of records)
  new_items.json         (fallback source of per-item metadata, if Claude
                          omitted urls/sources in a record)

Each classification record has shape:
  {
    "stable_id": "...",      # required (the key used in new_items.json)
    "canonical_id": "...",   # optional; defaults to first extracted_cves, else stable_id
    "bucket": "toimplement|tocheck|unrelated",
    "extracted_cves": ["...", ...],  # optional
    "sources": ["...", ...],         # optional
    "urls": ["...", ...],            # optional
    "reconsider": true               # optional; set by Claude for reconsidered
                                     # backlog entries — merge overwrites
                                     # the stored bucket (incl. demotions)
                                     # instead of promoting
  }

Behavior:
- For records WITHOUT `reconsider: true` (fresh items):
  upsert seen[canonical_id], union sources/urls, promote bucket strength.
- For records WITH `reconsider: true` (previously-classified entries):
  overwrite the stored bucket unconditionally (permits demotions), union
  sources/urls. If Claude's canonical_id differs from the stable_id (the
  previous canonical), rekey the seen entry under the new ID and leave
  the old as an alias — used when a CVE has since been assigned to what
  was previously a bare vendor-ID entry.
- For every alt_id in (stable_id, vendor_ids, extracted_cves) that differs
  from canonical_id, set aliases[alt_id] = canonical_id.
- Update last_run to SCAN_DATE.
- Prune entries older than RETENTION_DAYS (180) before writing.
- Also writes the three daily watch_*.md files as stubs if Claude didn't run
  (i.e. when new_items.json was empty and the classify step was skipped).
"""
from __future__ import annotations

import argparse
import datetime
import json
import os
import pathlib
import sys
from typing import Any

from . import state


RETENTION_DAYS = 180
NEW_ITEMS_PATH = pathlib.Path("new_items.json")
CLASSIFICATIONS_PATH = pathlib.Path("classifications.json")


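A fabricated record matching the shape documented above, round-tripped through `json` (note JSON `true` loads as Python `True`, which is what the `rec.get("reconsider")` branch tests):

```python
import json

rec = json.loads("""
{
  "stable_id": "AMD-SB-7000",
  "canonical_id": "CVE-2025-0001",
  "bucket": "tocheck",
  "reconsider": true
}
""")
assert rec["reconsider"] is True
assert rec["bucket"] in ("toimplement", "tocheck", "unrelated")
```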
def _load_json(path: pathlib.Path, default: Any) -> Any:
    if not path.exists():
        return default
    return json.loads(path.read_text())


def _canonical(record: dict[str, Any], fallback_meta: dict[str, Any] | None) -> str:
    if record.get("canonical_id"):
        return record["canonical_id"]
    cves = record.get("extracted_cves") or (fallback_meta or {}).get("extracted_cves") or []
    if cves:
        return cves[0]
    return record["stable_id"]


def _alt_ids(record: dict[str, Any], fallback_meta: dict[str, Any] | None) -> list[str]:
    ids: list[str] = []
    ids.append(record.get("stable_id", ""))
    ids.extend(record.get("extracted_cves") or [])
    if fallback_meta:
        ids.extend(fallback_meta.get("extracted_cves") or [])
        ids.extend(fallback_meta.get("vendor_ids") or [])
        guid = fallback_meta.get("guid")
        if guid:
            ids.append(guid)
        link = fallback_meta.get("permalink")
        if link:
            ids.append(link)
    return [i for i in ids if i]


def _unique(seq: list[str]) -> list[str]:
    seen: set[str] = set()
    out: list[str] = []
    for x in seq:
        if x and x not in seen:
            seen.add(x)
            out.append(x)
    return out


def merge(
    data: dict[str, Any],
    classifications: list[dict[str, Any]],
    new_items_by_stable_id: dict[str, dict[str, Any]],
    scan_date: str,
) -> None:
    for rec in classifications:
        if not rec.get("stable_id"):
            continue
        if rec.get("reconsider"):
            _apply_reconsider(data, rec, scan_date)
        else:
            _apply_new_item(data, rec, new_items_by_stable_id, scan_date)


def _apply_new_item(
    data: dict[str, Any],
    rec: dict[str, Any],
    new_items_by_stable_id: dict[str, dict[str, Any]],
    scan_date: str,
) -> None:
    stable_id = rec["stable_id"]
    meta = new_items_by_stable_id.get(stable_id, {})
    canonical = _canonical(rec, meta)
    bucket = rec.get("bucket", "unrelated")
    title = (meta.get("title") or "").strip()

    existing = data["seen"].get(canonical)
    if existing is None:
        data["seen"][canonical] = {
            "bucket": bucket,
            "first_seen": scan_date,
            "seen_at": scan_date,
            "title": title,
            "sources": _unique(list(rec.get("sources") or [])
                               + ([meta.get("source")] if meta.get("source") else [])),
            "urls": _unique(list(rec.get("urls") or [])
                            + ([meta.get("permalink")] if meta.get("permalink") else [])),
        }
    else:
        existing["bucket"] = state.promote_bucket(existing["bucket"], bucket)
        existing["seen_at"] = scan_date
        existing.setdefault("first_seen", existing.get("seen_at") or scan_date)
        if not existing.get("title") and title:
            existing["title"] = title
        existing["sources"] = _unique(list(existing.get("sources") or [])
                                      + list(rec.get("sources") or [])
                                      + ([meta.get("source")] if meta.get("source") else []))
        existing["urls"] = _unique(list(existing.get("urls") or [])
                                   + list(rec.get("urls") or [])
                                   + ([meta.get("permalink")] if meta.get("permalink") else []))

    for alt in _alt_ids(rec, meta):
        if alt != canonical:
            data["aliases"][alt] = canonical


def _apply_reconsider(
    data: dict[str, Any],
    rec: dict[str, Any],
    scan_date: str,
) -> None:
    """Re-review of a previously-classified entry. The record's stable_id
    is the entry's current canonical key in state; `canonical_id` may name
    a new key (e.g. a freshly-assigned CVE) — in which case we rekey."""
    old_key = rec["stable_id"]
    new_canonical = _canonical(rec, None)
    bucket = rec.get("bucket", "unrelated")

    # Resolve the current record — may need to follow an alias if the
    # backlog snapshot the classifier reviewed is slightly out of sync.
    current_key = old_key if old_key in data["seen"] else data["aliases"].get(old_key)
    if not current_key or current_key not in data["seen"]:
        print(f"warning: reconsider record for {old_key!r} points at no "
              f"state entry; skipping.", file=sys.stderr)
        return

    existing = data["seen"][current_key]

    # Overwrite bucket unconditionally (allows demotions) and stamp the
    # reconsideration date so we can later throttle if this grows.
    existing["bucket"] = bucket
    existing["seen_at"] = scan_date
    existing["reconsidered_at"] = scan_date

    # Union any fresh sources/urls the classifier surfaced.
    if rec.get("sources"):
        existing["sources"] = _unique(list(existing.get("sources") or []) + list(rec["sources"]))
    if rec.get("urls"):
        existing["urls"] = _unique(list(existing.get("urls") or []) + list(rec["urls"]))

    # Alias every alt ID the classifier provided to the current key
    # (before a possible rekey below redirects them).
    for alt in _alt_ids(rec, None):
        if alt != current_key:
            data["aliases"][alt] = current_key

    # Rekey if Claude newly identified a canonical ID (e.g., a CVE for a
    # vendor-ID entry). If the destination already exists, merge; else
    # move. In both cases, retarget all aliases and leave the old key
    # itself as an alias.
    if new_canonical and new_canonical != current_key:
        if new_canonical in data["seen"]:
            dest = data["seen"][new_canonical]
            dest["bucket"] = state.promote_bucket(dest.get("bucket", "unrelated"),
                                                  existing.get("bucket", "unrelated"))
            dest["sources"] = _unique(list(dest.get("sources") or []) + list(existing.get("sources") or []))
            dest["urls"] = _unique(list(dest.get("urls") or []) + list(existing.get("urls") or []))
            if not dest.get("title") and existing.get("title"):
                dest["title"] = existing["title"]
            dest["seen_at"] = scan_date
            dest["reconsidered_at"] = scan_date
            dest.setdefault("first_seen", existing.get("first_seen") or scan_date)
            del data["seen"][current_key]
        else:
            data["seen"][new_canonical] = existing
            del data["seen"][current_key]

        for alias_key, target in list(data["aliases"].items()):
            if target == current_key:
                data["aliases"][alias_key] = new_canonical
        data["aliases"][current_key] = new_canonical
        # Clean up any self-aliases the retarget may have produced.
        for k in [k for k, v in data["aliases"].items() if k == v]:
            del data["aliases"][k]


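The rekey path's alias bookkeeping in isolation: retarget every alias of the old key, alias the old key itself, then drop any self-aliases the retarget produced. A standalone sketch over a toy alias map (IDs fabricated):

```python
aliases = {"CVE-2025-9": "AMD-SB-7000", "guid-123": "AMD-SB-7000"}
current_key, new_canonical = "AMD-SB-7000", "CVE-2025-9"

# Retarget aliases that pointed at the old key.
for alias_key, target in list(aliases.items()):
    if target == current_key:
        aliases[alias_key] = new_canonical
# The old key itself becomes an alias of the new canonical.
aliases[current_key] = new_canonical
# Drop self-aliases (here, CVE-2025-9 -> CVE-2025-9 from the retarget).
for k in [k for k, v in aliases.items() if k == v]:
    del aliases[k]

assert aliases == {"guid-123": "CVE-2025-9", "AMD-SB-7000": "CVE-2025-9"}
```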
def ensure_stub_reports(scan_date: str) -> None:
    """If the Claude step was skipped, write empty stub watch_*.md files so the
    report artifact is consistent across runs."""
    day = scan_date[:10]  # YYYY-MM-DD
    stub = "(no new items in this window)\n"
    for bucket in ("toimplement", "tocheck", "unrelated"):
        p = pathlib.Path(f"watch_{day}_{bucket}.md")
        if not p.exists():
            p.write_text(stub)


def write_snapshots(data: dict[str, Any], scan_date: str) -> None:
    """Write current_toimplement.md and current_tocheck.md — full backlog
    snapshots reflecting every entry in state under those buckets. A human
    who reads only the latest run's artifact sees the complete picture
    without having to consult prior runs."""
    for bucket in ("toimplement", "tocheck"):
        entries = [
            (cid, rec) for cid, rec in data["seen"].items()
            if rec.get("bucket") == bucket
        ]
        # Oldest first — long-lingering items stay at the top as a reminder.
        entries.sort(key=lambda kv: kv[1].get("first_seen") or kv[1].get("seen_at") or "")
        out = [
            f"# Current `{bucket}` backlog",
            "",
            f"_Snapshot as of {scan_date}. "
            f"{len(entries)} item(s). Oldest first._",
            "",
        ]
        if not entries:
            out.append("(backlog is empty)")
        else:
            for cid, rec in entries:
                title = rec.get("title") or ""
                first_seen = (rec.get("first_seen") or rec.get("seen_at") or "")[:10]
                sources = ", ".join(rec.get("sources") or []) or "(none)"
                out.append(f"- **{cid}**" + (f" — {title}" if title else ""))
                out.append(f"  first seen {first_seen} · sources: {sources}")
                for u in rec.get("urls") or []:
                    out.append(f"  - {u}")
                out.append("")
        pathlib.Path(f"current_{bucket}.md").write_text("\n".join(out))


def main() -> int:
    ap = argparse.ArgumentParser()
    ap.add_argument("--scan-date", default=os.environ.get("SCAN_DATE", ""))
    ap.add_argument("--classifications", type=pathlib.Path, default=CLASSIFICATIONS_PATH)
    ap.add_argument("--new-items", type=pathlib.Path, default=NEW_ITEMS_PATH)
    args = ap.parse_args()

    scan_date = args.scan_date or datetime.datetime.now(datetime.timezone.utc).isoformat()

    data = state.load()
    classifications = _load_json(args.classifications, [])
    new_items_doc = _load_json(args.new_items, {"items": []})
    new_items_by_stable_id = {
        it["stable_id"]: it
        for it in new_items_doc.get("items", [])
        if it.get("stable_id")
    }

    if not isinstance(classifications, list):
        print(f"warning: {args.classifications} is not a list; ignoring", file=sys.stderr)
        classifications = []

    merge(data, classifications, new_items_by_stable_id, scan_date)
    data["last_run"] = scan_date

    scan_now = datetime.datetime.fromisoformat(scan_date.replace("Z", "+00:00"))
    before, after = state.prune(data, RETENTION_DAYS, scan_now)
    state.save(data)
    ensure_stub_reports(scan_date)
    write_snapshots(data, scan_date)

    print(f"Merged {len(classifications)} classifications.")
    print(f"Pruned seen: {before} -> {after} entries (retention={RETENTION_DAYS}d).")
    print(f"Aliases: {len(data['aliases'])}.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
@@ -0,0 +1,59 @@
"""Declarative list of sources polled by the daily vuln scan."""

from dataclasses import dataclass
from typing import Literal

Kind = Literal["rss", "atom", "html"]


@dataclass(frozen=True)
class Source:
    name: str
    url: str
    kind: Kind
    # For HTML sources: regexes used to extract advisory IDs from the page.
    advisory_id_patterns: tuple[str, ...] = ()
    # Human-facing URL to use as permalink fallback when `url` points at a
    # non-browsable endpoint (e.g. a JS data file). Empty = use `url`.
    display_url: str = ""
    # Per-source UA override. AMD's CDN drops connections when the UA string
    # contains a parenthesized URL, while Intel/ARM's WAF rejects UAs that
    # don't identify themselves — so we can't use one UA everywhere.
    # Empty = use the module-level USER_AGENT.
    user_agent: str = ""


SOURCES: tuple[Source, ...] = (
    Source("phoronix", "https://www.phoronix.com/rss.php", "rss"),
    Source("oss-sec", "https://seclists.org/rss/oss-sec.rss", "rss"),
    Source("lwn", "https://lwn.net/headlines/newrss", "rss"),
    Source("project-zero", "https://googleprojectzero.blogspot.com/feeds/posts/default", "atom"),
    Source("vusec", "https://www.vusec.net/feed/", "rss"),
    Source("comsec-eth", "https://comsec.ethz.ch/category/news/feed/", "rss"),
    # api.msrc.microsoft.com/update-guide/rss is the real RSS endpoint; the
    # msrc.microsoft.com/... URL returns the SPA shell (2.7 KB) instead.
    Source("msrc", "https://api.msrc.microsoft.com/update-guide/rss", "rss"),
    Source("cisa", "https://www.cisa.gov/cybersecurity-advisories/all.xml", "rss"),
    Source("cert-cc", "https://www.kb.cert.org/vuls/atomfeed/", "atom"),
    Source("intel-psirt", "https://www.intel.com/content/www/us/en/security-center/default.html", "html",
           (r"INTEL-SA-\d+",)),
    Source("amd-psirt", "https://www.amd.com/en/resources/product-security.html", "html",
           (r"AMD-SB-\d+",),
           user_agent="spectre-meltdown-checker/vuln-watch"),
    Source("arm-spec", "https://developer.arm.com/Arm%20Security%20Center/Speculative%20Processor%20Vulnerability", "html",
           (r"CVE-\d{4}-\d{4,7}",)),
    # transient.fail renders its attack table from tree.js client-side; we
    # pull the JS file directly (CVE regex works on its JSON-ish body).
    Source("transient-fail", "https://transient.fail/tree.js", "html",
           (r"CVE-\d{4}-\d{4,7}",),
           display_url="https://transient.fail/"),
)

# Identify ourselves honestly. Akamai/Cloudflare WAFs fronting intel.com,
# developer.arm.com, and cisa.gov return 403 when the UA claims "Mozilla"
# but TLS/HTTP fingerprint doesn't match a real browser — an honest bot UA
# passes those rules cleanly.
USER_AGENT = (
    "spectre-meltdown-checker/vuln-watch "
    "(+https://github.com/speed47/spectre-meltdown-checker)"
)
REQUEST_TIMEOUT = 30
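The `user_agent` field's "empty = use the module-level USER_AGENT" contract implies a small selection step in whichever fetcher consumes `SOURCES`. The fetcher itself is not in this diff, so this is only a sketch of how that fallback can be honored (the `build_request` name is made up; `USER_AGENT` mirrors the constant above):

```python
import urllib.request

# Module-level default UA, copied from sources.py above.
USER_AGENT = (
    "spectre-meltdown-checker/vuln-watch "
    "(+https://github.com/speed47/spectre-meltdown-checker)"
)


def build_request(url: str, source_user_agent: str = "") -> urllib.request.Request:
    """Build a GET request using the per-source UA override when set,
    otherwise the module-level default (hypothetical helper name)."""
    ua = source_user_agent or USER_AGENT
    return urllib.request.Request(url, headers={"User-Agent": ua})
```

A fetcher would then call `build_request(src.url, src.user_agent)` for each `src` in `SOURCES`, which makes the AMD special case fall out of the data rather than the code.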
@@ -0,0 +1,137 @@
|
|||||||
|
"""Load/save/migrate/lookup helpers for state/seen.json.
|
||||||
|
|
||||||
|
Schema v2:
|
||||||
|
{
|
||||||
|
"schema_version": 2,
|
||||||
|
"last_run": "<iso8601>|null",
|
||||||
|
"sources": {
|
||||||
|
"<name>": {
|
||||||
|
"etag": "...",
|
||||||
|
"last_modified": "...",
|
||||||
|
"hwm_id": "...",
|
||||||
|
"hwm_published_at": "<iso8601>",
|
||||||
|
"last_fetched_at": "<iso8601>",
|
||||||
|
"last_status": 200|304|<http-err>|"<str-err>"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"seen": {
|
||||||
|
"<canonical_id>": {
|
||||||
|
"bucket": "toimplement|tocheck|unrelated",
|
||||||
|
"seen_at": "<iso8601>",
|
||||||
|
"sources": ["<source-name>", ...],
|
||||||
|
"urls": ["<permalink>", ...]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"aliases": { "<alt_id>": "<canonical_id>" }
|
||||||
|
}
|
||||||
|
"""
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import datetime
|
||||||
|
import json
|
||||||
|
import pathlib
|
||||||
|
from typing import Any
|
||||||
|
|
||||||
|
|
||||||
|
STATE_PATH = pathlib.Path("state/seen.json")
|
||||||
|
SCHEMA_VERSION = 2
|
||||||
|
|
||||||
|
|
||||||
|
def empty() -> dict[str, Any]:
|
||||||
|
return {
|
||||||
|
"schema_version": SCHEMA_VERSION,
|
||||||
|
"last_run": None,
|
||||||
|
"sources": {},
|
||||||
|
"seen": {},
|
||||||
|
"aliases": {},
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def load(path: pathlib.Path = STATE_PATH) -> dict[str, Any]:
|
||||||
|
if not path.exists():
|
||||||
|
# Fallback: a committed bootstrap seed, used to bridge a workflow
|
||||||
|
# rename (old workflow_id's artifacts are invisible to the new one).
|
||||||
|
# Remove the bootstrap file once one successful run has produced a
|
||||||
|
# normal artifact, otherwise it will shadow any future first-run.
|
||||||
|
bootstrap = path.parent / f"{path.name}.bootstrap"
|
||||||
|
if bootstrap.exists():
|
||||||
|
print(f"state: seeding from {bootstrap} (no prior-run artifact found)")
|
||||||
|
path = bootstrap
|
||||||
|
if not path.exists():
|
||||||
|
return empty()
|
||||||
|
data = json.loads(path.read_text())
|
||||||
|
return _migrate(data)
|
||||||
|
|
||||||
|
|
||||||
|
def save(data: dict[str, Any], path: pathlib.Path = STATE_PATH) -> None:
|
||||||
|
path.parent.mkdir(parents=True, exist_ok=True)
|
||||||
|
path.write_text(json.dumps(data, indent=2, sort_keys=True) + "\n")
|
||||||
|
|
||||||
|
|
||||||
|
def _migrate(data: dict[str, Any]) -> dict[str, Any]:
|
||||||
|
"""Bring any older schema up to SCHEMA_VERSION."""
|
||||||
|
version = data.get("schema_version")
|
||||||
|
if version == SCHEMA_VERSION:
|
||||||
|
data.setdefault("sources", {})
|
||||||
|
data.setdefault("aliases", {})
|
||||||
|
data.setdefault("seen", {})
|
||||||
|
return data
|
||||||
|
|
||||||
|
# v1 shape: {"last_run": ..., "seen": {<id>: {bucket, seen_at, source, cve?}}}
|
||||||
|
migrated_seen: dict[str, Any] = {}
|
||||||
|
aliases: dict[str, str] = {}
|
||||||
|
for key, entry in (data.get("seen") or {}).items():
|
||||||
|
rec = {
|
||||||
|
"bucket": entry.get("bucket", "unrelated"),
|
||||||
|
"seen_at": entry.get("seen_at"),
|
||||||
|
"sources": [entry["source"]] if entry.get("source") else [],
|
||||||
|
"urls": [key] if isinstance(key, str) and key.startswith("http") else [],
|
||||||
|
}
|
||||||
|
migrated_seen[key] = rec
|
||||||
|
# If a v1 entry had a CVE that differs from the key, alias the CVE -> key.
|
||||||
|
cve = entry.get("cve")
|
||||||
|
if cve and cve != key:
|
||||||
|
aliases[cve] = key
|
||||||
|
|
||||||
|
return {
|
||||||
|
"schema_version": SCHEMA_VERSION,
|
||||||
|
"last_run": data.get("last_run"),
|
||||||
|
"sources": {},
|
||||||
|
"seen": migrated_seen,
|
||||||
|
"aliases": aliases,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def lookup(data: dict[str, Any], candidate_ids: list[str]) -> str | None:
|
||||||
|
"""Return the canonical key if any candidate is already known, else None."""
|
||||||
|
seen = data["seen"]
|
||||||
|
aliases = data["aliases"]
|
||||||
|
for cid in candidate_ids:
|
||||||
|
if not cid:
|
||||||
|
continue
|
||||||
|
if cid in seen:
|
||||||
|
return cid
|
||||||
|
canonical = aliases.get(cid)
|
||||||
|
if canonical and canonical in seen:
|
||||||
|
return canonical
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
_BUCKET_STRENGTH = {"unrelated": 0, "tocheck": 1, "toimplement": 2}
|
||||||
|
|
||||||
|
|
||||||
|
def promote_bucket(current: str, incoming: str) -> str:
|
||||||
|
"""Return whichever of two buckets represents the 'stronger' classification."""
|
||||||
|
return incoming if _BUCKET_STRENGTH.get(incoming, 0) > _BUCKET_STRENGTH.get(current, 0) else current
|
||||||
|
|
||||||
|
|
||||||
|
def prune(data: dict[str, Any], days: int, now: datetime.datetime) -> tuple[int, int]:
|
||||||
|
"""Drop seen entries older than `days`, and aliases pointing at dropped keys."""
|
||||||
|
cutoff = (now - datetime.timedelta(days=days)).isoformat()
|
||||||
|
before = len(data["seen"])
|
||||||
|
data["seen"] = {
|
||||||
|
k: v for k, v in data["seen"].items()
|
||||||
|
if (v.get("seen_at") or "9999") >= cutoff
|
||||||
|
}
|
||||||
|
data["aliases"] = {k: v for k, v in data["aliases"].items() if v in data["seen"]}
|
||||||
|
return before, len(data["seen"])
|
||||||
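Two details of the state module are easy to miss: `prune` compares ISO-8601 timestamps as plain strings (which is valid because they sort lexicographically, and `or "9999"` keeps undated entries forever), and `promote_bucket` never downgrades a classification. A standalone sketch exercising both — the two helpers are copied from the module above, and the sample state is invented for illustration:

```python
import datetime

_BUCKET_STRENGTH = {"unrelated": 0, "tocheck": 1, "toimplement": 2}


def promote_bucket(current: str, incoming: str) -> str:
    # "toimplement" beats "tocheck" beats "unrelated"; unknown buckets rank lowest.
    return incoming if _BUCKET_STRENGTH.get(incoming, 0) > _BUCKET_STRENGTH.get(current, 0) else current


def prune(data, days, now):
    # ISO-8601 strings compare lexicographically, so string >= works as a
    # date comparison; entries lacking seen_at default to "9999" (kept).
    cutoff = (now - datetime.timedelta(days=days)).isoformat()
    before = len(data["seen"])
    data["seen"] = {k: v for k, v in data["seen"].items()
                    if (v.get("seen_at") or "9999") >= cutoff}
    # Aliases pointing at pruned keys would dangle; drop them too.
    data["aliases"] = {k: v for k, v in data["aliases"].items() if v in data["seen"]}
    return before, len(data["seen"])


now = datetime.datetime(2026, 4, 19, tzinfo=datetime.timezone.utc)
data = {
    "seen": {
        "CVE-2026-0001": {"bucket": "tocheck", "seen_at": "2026-04-18T00:00:00+00:00"},
        "CVE-2020-0001": {"bucket": "unrelated", "seen_at": "2020-01-01T00:00:00+00:00"},
    },
    "aliases": {"CVE-2020-9999": "CVE-2020-0001"},  # will dangle after pruning
}
before, after = prune(data, 180, now)
```

With a 180-day retention, the 2020 entry falls out along with its alias, while the recent entry survives.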
@@ -1,380 +0,0 @@
|
|||||||
#! /bin/sh
|
|
||||||
# Spectre & Meltdown checker
|
|
||||||
# Stephane Lesimple
|
|
||||||
VERSION=0.14
|
|
||||||
|
|
||||||
# print status function
|
|
||||||
pstatus()
|
|
||||||
{
|
|
||||||
case "$1" in
|
|
||||||
red) col="\033[101m\033[30m";;
|
|
||||||
green) col="\033[102m\033[30m";;
|
|
||||||
yellow) col="\033[103m\033[30m";;
|
|
||||||
*) col="";;
|
|
||||||
esac
|
|
||||||
/bin/echo -ne "$col $2 \033[0m"
|
|
||||||
[ -n "$3" ] && /bin/echo -n " ($3)"
|
|
||||||
/bin/echo
|
|
||||||
}
|
|
||||||
|
|
||||||
# The 3 below functions are taken from the extract-linux script, available here:
|
|
||||||
# https://github.com/torvalds/linux/blob/master/scripts/extract-vmlinux
|
|
||||||
# The functions have been modified for better integration to this script
|
|
||||||
# The original header of the file has been retained below
|
|
||||||
|
|
||||||
# ----------------------------------------------------------------------
|
|
||||||
# extract-vmlinux - Extract uncompressed vmlinux from a kernel image
|
|
||||||
#
|
|
||||||
# Inspired from extract-ikconfig
|
|
||||||
# (c) 2009,2010 Dick Streefland <dick@streefland.net>
|
|
||||||
#
|
|
||||||
# (c) 2011 Corentin Chary <corentin.chary@gmail.com>
|
|
||||||
#
|
|
||||||
# Licensed under the GNU General Public License, version 2 (GPLv2).
|
|
||||||
# ----------------------------------------------------------------------
|
|
||||||
|
|
||||||
check_vmlinux()
|
|
||||||
{
|
|
||||||
file "$1" 2>/dev/null | grep -q ELF || return 1
|
|
||||||
return 0
|
|
||||||
}
|
|
||||||
|
|
||||||
try_decompress()
|
|
||||||
{
|
|
||||||
# The obscure use of the "tr" filter is to work around older versions of
|
|
||||||
# "grep" that report the byte offset of the line instead of the pattern.
|
|
||||||
|
|
||||||
# Try to find the header ($1) and decompress from here
|
|
||||||
for pos in `tr "$1\n$2" "\n$2=" < "$4" | grep -abo "^$2"`
|
|
||||||
do
|
|
||||||
pos=${pos%%:*}
|
|
||||||
tail -c+$pos "$4" | $3 > $vmlinuxtmp 2> /dev/null
|
|
||||||
check_vmlinux "$vmlinuxtmp" && echo "$vmlinuxtmp" && return 0
|
|
||||||
done
|
|
||||||
return 1
|
|
||||||
}
|
|
||||||
|
|
||||||
extract_vmlinux()
|
|
||||||
{
|
|
||||||
[ -n "$1" ] || return 1
|
|
||||||
# Prepare temp files:
|
|
||||||
vmlinuxtmp="$(mktemp /tmp/vmlinux-XXX)"
|
|
||||||
|
|
||||||
# Initial attempt for uncompressed images or objects:
|
|
||||||
if check_vmlinux "$1"; then
|
|
||||||
cat "$1" > "$vmlinuxtmp"
|
|
||||||
echo "$vmlinuxtmp"
|
|
||||||
return 0
|
|
||||||
fi
|
|
||||||
|
|
||||||
# That didn't work, so retry after decompression.
|
|
||||||
try_decompress '\037\213\010' xy gunzip "$1" && return 0
|
|
||||||
try_decompress '\3757zXZ\000' abcde unxz "$1" && return 0
|
|
||||||
try_decompress 'BZh' xy bunzip2 "$1" && return 0
|
|
||||||
try_decompress '\135\0\0\0' xxx unlzma "$1" && return 0
|
|
||||||
try_decompress '\211\114\132' xy 'lzop -d' "$1" && return 0
|
|
||||||
return 1
|
|
||||||
}
|
|
||||||
|
|
||||||
# end of extract-vmlinux functions
|
|
||||||
|
|
||||||
/bin/echo -e "\033[1;34mSpectre and Meltdown mitigation detection tool v$VERSION\033[0m"
|
|
||||||
/bin/echo
|
|
||||||
|
|
||||||
# root check
|
|
||||||
|
|
||||||
if [ "$(id -u)" -ne 0 ]; then
|
|
||||||
/bin/echo -e "\033[31mNote that you should launch this script with root privileges to get accurate information.\033[0m"
|
|
||||||
/bin/echo -e "\033[31mWe'll proceed but you might see permission denied errors.\033[0m"
|
|
||||||
/bin/echo -e "\033[31mTo run it as root, you can try the following command: sudo $0\033[0m"
|
|
||||||
/bin/echo
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo -e "Checking vulnerabilities against \033[35m"$(uname -s) $(uname -r) $(uname -v) $(uname -m)"\033[0m"
|
|
||||||
/bin/echo
|
|
||||||
|
|
||||||
###########
|
|
||||||
# SPECTRE 1
|
|
||||||
/bin/echo -e "\033[1;34mCVE-2017-5753 [bounds check bypass] aka 'Spectre Variant 1'\033[0m"
|
|
||||||
/bin/echo -n "* Kernel compiled with LFENCE opcode inserted at the proper places: "
|
|
||||||
|
|
||||||
status=0
|
|
||||||
img=''
|
|
||||||
# try to find the image of the current running kernel
|
|
||||||
[ -e /boot/vmlinuz-linux ] && img=/boot/vmlinuz-linux
|
|
||||||
[ -e /boot/vmlinuz-$(uname -r) ] && img=/boot/vmlinuz-$(uname -r)
|
|
||||||
[ -e /boot/kernel-$( uname -r) ] && img=/boot/kernel-$( uname -r)
|
|
||||||
[ -e /boot/bzImage-$(uname -r) ] && img=/boot/bzImage-$(uname -r)
|
|
||||||
[ -e /boot/kernel-genkernel-$(uname -m)-$(uname -r) ] && img=/boot/kernel-genkernel-$(uname -m)-$(uname -r)
|
|
||||||
if [ -z "$img" ]; then
|
|
||||||
pstatus yellow UNKNOWN "couldn't find your kernel image in /boot, if you used netboot, this is normal"
|
|
||||||
else
|
|
||||||
vmlinux=$(extract_vmlinux $img)
|
|
||||||
if [ -z "$vmlinux" -o ! -r "$vmlinux" ]; then
|
|
||||||
pstatus yellow UNKNOWN "couldn't extract your kernel from $img"
|
|
||||||
elif ! which objdump >/dev/null 2>&1; then
|
|
||||||
pstatus yellow UNKNOWN "missing 'objdump' tool, please install it, usually it's in the binutils package"
|
|
||||||
else
|
|
||||||
# here we disassemble the kernel and count the number of occurences of the LFENCE opcode
|
|
||||||
# in non-patched kernels, this has been empirically determined as being around 40-50
|
|
||||||
# in patched kernels, this is more around 70-80, sometimes way higher (100+)
|
|
||||||
# v0.13: 68 found in a 3.10.23-xxxx-std-ipv6-64 (with lots of modules compiled-in directly), which doesn't have the LFENCE patches,
|
|
||||||
# so let's push the threshold to 70.
|
|
||||||
# TODO LKML patch is starting to dump LFENCE in favor of the PAUSE opcode, we might need to check that (patch not stabilized yet)
|
|
||||||
nb_lfence=$(objdump -D "$vmlinux" | grep -wc lfence)
|
|
||||||
if [ "$nb_lfence" -lt 70 ]; then
|
|
||||||
pstatus red NO "only $nb_lfence opcodes found, should be >= 70"
|
|
||||||
status=1
|
|
||||||
else
|
|
||||||
pstatus green YES "$nb_lfence opcodes found, which is >= 70"
|
|
||||||
status=2
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo -ne "> \033[46m\033[30mSTATUS:\033[0m "
|
|
||||||
[ "$status" = 0 ] && pstatus yellow UNKNOWN
|
|
||||||
[ "$status" = 1 ] && pstatus red VULNERABLE
|
|
||||||
[ "$status" = 2 ] && pstatus green 'NOT VULNERABLE'
|
|
||||||
|
|
||||||
###########
|
|
||||||
# VARIANT 2
|
|
||||||
/bin/echo
|
|
||||||
/bin/echo -e "\033[1;34mCVE-2017-5715 [branch target injection] aka 'Spectre Variant 2'\033[0m"
|
|
||||||
/bin/echo "* Mitigation 1"
|
|
||||||
/bin/echo -n "* Hardware (CPU microcode) support for mitigation: "
|
|
||||||
if [ ! -e /dev/cpu/0/msr ]; then
|
|
||||||
# try to load the module ourselves (and remember it so we can rmmod it afterwards)
|
|
||||||
modprobe msr 2>/dev/null && insmod_msr=1
|
|
||||||
fi
|
|
||||||
if [ ! -e /dev/cpu/0/msr ]; then
|
|
||||||
pstatus yellow UNKNOWN "couldn't read /dev/cpu/0/msr, is msr support enabled in your kernel?"
|
|
||||||
else
|
|
||||||
# the new MSR 'SPEC_CTRL' is at offset 0x48
|
|
||||||
# here we use dd, it's the same as using 'rdmsr 0x48' but without needing the rdmsr tool
|
|
||||||
# if we get a read error, the MSR is not there
|
|
||||||
dd if=/dev/cpu/0/msr of=/dev/null bs=8 count=1 skip=9 2>/dev/null
|
|
||||||
if [ $? -eq 0 ]; then
|
|
||||||
pstatus green YES
|
|
||||||
else
|
|
||||||
pstatus red NO
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
|
|
||||||
if [ "$insmod_msr" = 1 ]; then
|
|
||||||
# if we used modprobe ourselves, rmmod the module
|
|
||||||
rmmod msr 2>/dev/null
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo -n "* Kernel support for IBRS: "
|
|
||||||
if [ ! -e /sys/kernel/debug/sched_features ]; then
|
|
||||||
# try to mount the debugfs hierarchy ourselves and remember it to umount afterwards
|
|
||||||
mount -t debugfs debugfs /sys/kernel/debug 2>/dev/null && mounted_debugfs=1
|
|
||||||
fi
|
|
||||||
if [ -e /sys/kernel/debug/ibrs_enabled ]; then
|
|
||||||
# if the file is there, we have IBRS compiled-in
|
|
||||||
pstatus green YES
|
|
||||||
ibrs_supported=1
|
|
||||||
ibrs_enabled=$(cat /sys/kernel/debug/ibrs_enabled 2>/dev/null)
|
|
||||||
elif [ -e /sys/kernel/debug/x86/ibrs_enabled ]; then
|
|
||||||
# RedHat uses a different path (see https://access.redhat.com/articles/3311301)
|
|
||||||
pstatus green YES
|
|
||||||
ibrs_supported=1
|
|
||||||
ibrs_enabled=$(cat /sys/kernel/debug/x86/ibrs_enabled 2>/dev/null)
|
|
||||||
else
|
|
||||||
pstatus red NO
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo -n "* IBRS enabled for Kernel space: "
|
|
||||||
# 0 means disabled
|
|
||||||
# 1 is enabled only for kernel space
|
|
||||||
# 2 is enabled for kernel and user space
|
|
||||||
case "$ibrs_enabled" in
|
|
||||||
"") [ "$ibrs_supported" = 1 ] && pstatus yellow UNKNOWN || pstatus red NO;;
|
|
||||||
0) pstatus red NO;;
|
|
||||||
1 | 2) pstatus green YES;;
|
|
||||||
*) pstatus yellow UNKNOWN;;
|
|
||||||
esac
|
|
||||||
|
|
||||||
/bin/echo -n "* IBRS enabled for User space: "
|
|
||||||
case "$ibrs_enabled" in
|
|
||||||
"") [ "$ibrs_supported" = 1 ] && pstatus yellow UNKNOWN || pstatus red NO;;
|
|
||||||
0 | 1) pstatus red NO;;
|
|
||||||
2) pstatus green YES;;
|
|
||||||
*) pstatus yellow UNKNOWN;;
|
|
||||||
esac
|
|
||||||
|
|
||||||
/bin/echo "* Mitigation 2"
|
|
||||||
/bin/echo -n "* Kernel compiled with retpoline option: "
|
|
||||||
# We check the RETPOLINE kernel options
|
|
||||||
if [ -e /proc/config.gz ]; then
|
|
||||||
# either the running kernel exports his own config
|
|
||||||
if zgrep -q '^CONFIG_RETPOLINE=y' /proc/config.gz; then
|
|
||||||
pstatus green YES
|
|
||||||
retpoline=1
|
|
||||||
else
|
|
||||||
pstatus red NO
|
|
||||||
fi
|
|
||||||
elif [ -e /boot/config-$(uname -r) ]; then
|
|
||||||
# or we can find a config file in /root with the kernel release name
|
|
||||||
if grep -q '^CONFIG_RETPOLINE=y' /boot/config-$(uname -r); then
|
|
||||||
pstatus green YES
|
|
||||||
retpoline=1
|
|
||||||
else
|
|
||||||
pstatus red NO
|
|
||||||
fi
|
|
||||||
else
|
|
||||||
pstatus yellow UNKNOWN "couldn't read your kernel configuration"
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo -n "* Kernel compiled with a retpoline-aware compiler: "
|
|
||||||
# Now check if the compiler used to compile the kernel knows how to insert retpolines in generated asm
|
|
||||||
# For gcc, this is -mindirect-branch=thunk-extern (detected by the kernel makefiles)
|
|
||||||
# See gcc commit https://github.com/hjl-tools/gcc/commit/23b517d4a67c02d3ef80b6109218f2aadad7bd79
|
|
||||||
# We'll look for the presence of 'retpoline_call_target' in symbols
|
|
||||||
if [ -n "$vmlinux" ]; then
|
|
||||||
# look for the symbol
|
|
||||||
if which nm >/dev/null 2>&1; then
|
|
||||||
# the proper way: use nm and look for the symbol
|
|
||||||
if nm "$vmlinux" 2>/dev/null | grep -qw retpoline_call_target; then
|
|
||||||
retpoline_compiler=1
|
|
||||||
pstatus green YES "retpoline_call_target found"
|
|
||||||
fi
|
|
||||||
else
|
|
||||||
# if we don't have nm, nevermind, the symbol name is long enough to not have
|
|
||||||
# any false positive using good old grep directly on the binary
|
|
||||||
if grep -q retpoline_call_target "$vmlinux"; then
|
|
||||||
retpoline_compiler=1
|
|
||||||
pstatus green YES "retpoline_call_target found"
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
if [ "$retpoline_compiler" != 1 ]; then
|
|
||||||
# still not ? maybe we just don't have symbols in the kernel image (stripped)
|
|
||||||
# let's objdump it and look for the asm sequence (here for 64 bits)
|
|
||||||
#
|
|
||||||
# ffffffff81000350 <__x86.indirect_thunk>:
|
|
||||||
# ffffffff81000350: e8 05 00 00 00 callq ffffffff8100035a <retpoline_call_target>
|
|
||||||
# ffffffff81000355: 0f ae e8 lfence
|
|
||||||
# ffffffff81000358: eb fb jmp ffffffff81000355 <__x86.indirect_thunk+0x5>
|
|
||||||
|
|
||||||
# ffffffff8100035a <retpoline_call_target>:
|
|
||||||
# ffffffff8100035a: 48 8d 64 24 08 lea 0x8(%rsp),%rsp
|
|
||||||
# ffffffff8100035f: c3 retq
|
|
||||||
#
|
|
||||||
if ! which perl >/dev/null 2>&1; then
|
|
||||||
pstatus yellow UNKNOWN "missing 'perl', please install it"
|
|
||||||
else
|
|
||||||
# directly look for the opcode sequence in the binary
|
|
||||||
# 64 bits version
|
|
||||||
if perl -ne '/\xe8\x05\x00\x00\x00\x0f\xae\xe8\xeb\xfb\x48\x8d\x64\x24\x08\xc3/ and $found=1; END { exit($found ? 0 : 1) }' "$vmlinux"; then
|
|
||||||
retpoline_compiler=1
|
|
||||||
pstatus green YES "retpoline 64 bits asm sequence found"
|
|
||||||
#elif perl -ne ... 32 bits version of retpoline asm seq
|
|
||||||
# TODO
|
|
||||||
# retpoline_compiler=1
|
|
||||||
# pstatus green YES "retpoline 33 bits asm sequence found"
|
|
||||||
else
|
|
||||||
pstatus red NO
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
else
|
|
||||||
pstatus yellow UNKNOWN "couldn't find your kernel image"
|
|
||||||
fi
|
|
||||||
|
|
||||||
|
|
||||||
/bin/echo -ne "> \033[46m\033[30mSTATUS:\033[0m "
|
|
||||||
if grep -q AMD /proc/cpuinfo; then
|
|
||||||
pstatus green "NOT VULNERABLE" "your CPU is not vulnerable as per the vendor"
|
|
||||||
elif [ "$ibrs_enabled" = 1 -o "$ibrs_enabled" = 2 ]; then
|
|
||||||
pstatus green "NOT VULNERABLE" "IBRS mitigates the vulnerability"
|
|
||||||
elif [ "$retpoline" = 1 -a "$retpoline_compiler" = 1 ]; then
|
|
||||||
pstatus green "NOT VULNERABLE" "retpoline mitigate the vulnerability"
|
|
||||||
else
|
|
||||||
pstatus red VULNERABLE "IBRS hardware + kernel support OR kernel with retpoline are needed to mitigate the vulnerability"
|
|
||||||
fi
|
|
||||||
|
|
||||||
##########
|
|
||||||
# MELTDOWN
|
|
||||||
/bin/echo
|
|
||||||
/bin/echo -e "\033[1;34mCVE-2017-5754 [rogue data cache load] aka 'Meltdown' aka 'Variant 3'\033[0m"
|
|
||||||
/bin/echo -n "* Kernel supports Page Table Isolation (PTI): "
|
|
||||||
kpti_support=0
|
|
||||||
kpti_can_tell=0
|
|
||||||
if [ -e /proc/config.gz ]; then
|
|
||||||
# either the running kernel exports his own config
|
|
||||||
kpti_can_tell=1
|
|
||||||
if zgrep -q '^\(CONFIG_PAGE_TABLE_ISOLATION=y\|CONFIG_KAISER=y\)' /proc/config.gz; then
|
|
||||||
kpti_support=1
|
|
||||||
fi
|
|
||||||
elif [ -e /boot/config-$(uname -r) ]; then
|
|
||||||
# or we can find a config file in /root with the kernel release name
|
|
||||||
kpti_can_tell=1
|
|
||||||
if grep -q '^\(CONFIG_PAGE_TABLE_ISOLATION=y\|CONFIG_KAISER=y\)' /boot/config-$(uname -r); then
|
|
||||||
kpti_support=1
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
if [ -e /boot/System.map-$(uname -r) ]; then
|
|
||||||
# it's not an elif: some backports don't have the PTI config but still include the patch
|
|
||||||
# so we try to find an exported symbol that is part of the PTI patch in System.map
|
|
||||||
kpti_can_tell=1
|
|
||||||
if grep -qw kpti_force_enabled /boot/System.map-$(uname -r); then
|
|
||||||
kpti_support=1
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
if [ -n "$vmlinux" ]; then
|
|
||||||
# same as above but in case we don't have System.map and only vmlinux, look for the
|
|
||||||
# nopti option that is part of the patch (kernel command line option)
|
|
||||||
kpti_can_tell=1
|
|
||||||
if strings "$vmlinux" | grep -qw nopti; then
|
|
||||||
kpti_support=1
|
|
||||||
fi
|
|
||||||
fi
|
|
||||||
|
|
||||||
if [ "$kpti_support" = 1 ]; then
|
|
||||||
pstatus green YES
|
|
||||||
elif [ "$kpti_can_tell" = 1 ]; then
|
|
||||||
pstatus red NO
|
|
||||||
else
|
|
||||||
pstatus yellow UNKNOWN "couldn't read your kernel configuration"
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo -n "* PTI enabled and active: "
|
|
||||||
if grep ^flags /proc/cpuinfo | grep -qw pti; then
|
|
||||||
# vanilla PTI patch sets the 'pti' flag in cpuinfo
|
|
||||||
kpti_enabled=1
|
|
||||||
elif grep ^flags /proc/cpuinfo | grep -qw kaiser; then
|
|
||||||
# kernel line 4.9 sets the 'kaiser' flag in cpuinfo
|
|
||||||
kpti_enabled=1
|
|
||||||
elif [ -e /sys/kernel/debug/x86/pti_enabled ]; then
|
|
||||||
# RedHat Backport creates a dedicated file, see https://access.redhat.com/articles/3311301
|
|
||||||
kpti_enabled=$(cat /sys/kernel/debug/x86/pti_enabled 2>/dev/null)
|
|
||||||
elif dmesg | grep -Eq 'Kernel/User page tables isolation: enabled|Kernel page table isolation enabled'; then
|
|
||||||
# if we can't find the flag, grep in dmesg
|
|
||||||
kpti_enabled=1
|
|
||||||
else
|
|
||||||
kpti_enabled=0
|
|
||||||
fi
|
|
||||||
if [ "$kpti_enabled" = 1 ]; then
|
|
||||||
pstatus green YES
|
|
||||||
else
|
|
||||||
pstatus red NO
|
|
||||||
fi
|
|
||||||
|
|
||||||
if [ "$mounted_debugfs" = 1 ]; then
|
|
||||||
# umount debugfs if we did mount it ourselves
|
|
||||||
umount /sys/kernel/debug
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo -ne "> \033[46m\033[30mSTATUS:\033[0m "
|
|
||||||
if grep -q AMD /proc/cpuinfo; then
|
|
||||||
pstatus green "NOT VULNERABLE" "your CPU is not vulnerable as per the vendor"
|
|
||||||
elif [ "$kpti_enabled" = 1 ]; then
|
|
||||||
pstatus green "NOT VULNERABLE" "PTI mitigates the vulnerability"
|
|
||||||
else
|
|
||||||
pstatus red "VULNERABLE" "PTI is needed to mitigate the vulnerability"
|
|
||||||
fi
|
|
||||||
|
|
||||||
/bin/echo
|
|
||||||
|
|
||||||
[ -n "$vmlinux" -a -f "$vmlinux" ] && rm -f "$vmlinux"
|
|
||||||
@@ -0,0 +1,824 @@
{
  "aliases": {
    "CVE-2018-3615": "CVE-2018-3646",
    "CVE-2018-3620": "CVE-2018-3646",
    "CVE-2025-54505": "https://www.phoronix.com/news/AMD-FP-DSS-Zen-1-Bug",
    "CVE-2026-33691": "https://seclists.org/oss-sec/2026/q2/175",
    "CVE-2026-41113": "https://seclists.org/oss-sec/2026/q2/176",
    "CVE-2026-4519": "CVE-2026-4786",
    "https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-33055": "CVE-2026-33055",
    "https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-33056": "CVE-2026-33056",
    "https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-4786": "CVE-2026-4786",
    "https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-5160": "CVE-2026-5160",
    "https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-6100": "CVE-2026-6100",
    "https://seclists.org/oss-sec/2026/q2/173": "CVE-2026-33691",
    "https://seclists.org/oss-sec/2026/q2/176": "CVE-2026-41113",
    "https://transient.fail/": "CVE-2019-11091"
  },
  "last_run": "2026-04-19T14:06:07.928573+00:00",
  "schema_version": 2,
  "seen": {
    "AMD-SB-7050": {
      "bucket": "tocheck",
      "first_seen": "2026-04-18T14:24:43Z",
      "seen_at": "2026-04-18T14:24:43Z",
      "sources": [
        "amd-psirt"
      ],
      "title": "",
      "urls": []
    },
    "AMD-SB-7053": {
      "bucket": "toimplement",
      "first_seen": "2026-04-18T14:24:43Z",
      "seen_at": "2026-04-18T14:24:43Z",
      "sources": [
        "amd-psirt"
      ],
      "title": "",
      "urls": []
    },
    "CVE-2017-5715": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2017-5715",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2017-5753": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2017-5753",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2017-5754": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2017-5754",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2018-12126": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2018-12126",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2018-12127": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2018-12127",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2018-12130": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2018-12130",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2018-3639": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2018-3639",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2018-3640": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2018-3640",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2018-3646": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2018-3615",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2018-3665": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2018-3665",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2019-11091": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2019-11091",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2019-11135": {
      "bucket": "unrelated",
      "first_seen": "2026-04-19T14:06:07.928573+00:00",
      "seen_at": "2026-04-19T14:06:07.928573+00:00",
      "sources": [
        "transient-fail"
      ],
      "title": "CVE-2019-11135",
      "urls": [
        "https://transient.fail/"
      ]
    },
    "CVE-2025-66335": {
      "bucket": "unrelated",
      "first_seen": "2026-04-18T14:24:43Z",
      "seen_at": "2026-04-18T14:24:43Z",
      "sources": [
        "oss-sec"
      ],
      "title": "",
      "urls": []
    },
    "CVE-2026-25917": {
      "bucket": "unrelated",
      "first_seen": "2026-04-18T14:24:43Z",
      "seen_at": "2026-04-18T14:24:43Z",
      "sources": [
        "oss-sec"
      ],
      "title": "",
      "urls": []
    },
    "CVE-2026-30898": {
      "bucket": "unrelated",
      "first_seen": "2026-04-18T14:24:43Z",
      "seen_at": "2026-04-18T14:24:43Z",
      "sources": [
        "oss-sec"
      ],
      "title": "",
      "urls": []
    },
    "CVE-2026-30912": {
      "bucket": "unrelated",
      "first_seen": "2026-04-18T14:24:43Z",
      "seen_at": "2026-04-18T14:24:43Z",
      "sources": [
        "oss-sec"
      ],
      "title": "",
      "urls": []
    },
    "CVE-2026-32228": {
      "bucket": "unrelated",
      "first_seen": "2026-04-18T14:24:43Z",
      "seen_at": "2026-04-18T14:24:43Z",
      "sources": [
        "oss-sec"
      ],
      "title": "",
      "urls": []
    },
    "CVE-2026-32690": {
      "bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": []
|
||||||
|
},
|
||||||
|
"CVE-2026-33055": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"msrc"
|
||||||
|
],
|
||||||
|
"title": "CVE-2026-33055 tar-rs incorrectly ignores PAX size headers if header size is nonzero",
|
||||||
|
"urls": [
|
||||||
|
"https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-33055"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"CVE-2026-33056": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"msrc"
|
||||||
|
],
|
||||||
|
"title": "CVE-2026-33056 tar-rs: unpack_in can chmod arbitrary directories by following symlinks",
|
||||||
|
"urls": [
|
||||||
|
"https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-33056"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"CVE-2026-33691": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "Re: [CVE-2026-33691] OWASP CRS whitespace padding bypass vulnerability",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/173",
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/174",
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/175"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"CVE-2026-39314": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": []
|
||||||
|
},
|
||||||
|
"CVE-2026-40170": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": []
|
||||||
|
},
|
||||||
|
"CVE-2026-40948": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": []
|
||||||
|
},
|
||||||
|
"CVE-2026-41113": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "CVE-2026-41113: RCE in sagredo fork of qmail",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/176"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"CVE-2026-41254": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": []
|
||||||
|
},
|
||||||
|
"CVE-2026-4786": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"msrc"
|
||||||
|
],
|
||||||
|
"title": "CVE-2026-4786 Incomplete mitigation of CVE-2026-4519, %action expansion for command injection to webbrowser.open()",
|
||||||
|
"urls": [
|
||||||
|
"https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-4786"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"CVE-2026-5160": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"msrc"
|
||||||
|
],
|
||||||
|
"title": "CVE-2026-5160",
|
||||||
|
"urls": [
|
||||||
|
"https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-5160"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"CVE-2026-6100": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"msrc"
|
||||||
|
],
|
||||||
|
"title": "CVE-2026-6100 Use-after-free in lzma.LZMADecompressor, bz2.BZ2Decompressor, and gzip.GzipFile after re-use under memory pressure",
|
||||||
|
"urls": [
|
||||||
|
"https://msrc.microsoft.com/update-guide/vulnerability/CVE-2026-6100"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://lwn.net/Articles/1066156/": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"lwn"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://lwn.net/Articles/1066156/"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://lwn.net/Articles/1067029/": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"lwn"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://lwn.net/Articles/1067029/"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://lwn.net/Articles/1068400/": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"lwn"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://lwn.net/Articles/1068400/"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://lwn.net/Articles/1068473/": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"lwn"
|
||||||
|
],
|
||||||
|
"title": "Seven stable kernels for Saturday",
|
||||||
|
"urls": [
|
||||||
|
"https://lwn.net/Articles/1068473/"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/164": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/164"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/167": {
|
||||||
|
"bucket": "toimplement",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/167"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/169": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/169"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/170": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/170"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/171": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/171"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/172": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T09:01:57Z",
|
||||||
|
"seen_at": "2026-04-19T09:01:57Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/172"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/173": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T09:01:57Z",
|
||||||
|
"seen_at": "2026-04-19T09:01:57Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/173"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/174": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T09:01:57Z",
|
||||||
|
"seen_at": "2026-04-19T09:01:57Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/174"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/175": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T09:01:57Z",
|
||||||
|
"seen_at": "2026-04-19T09:01:57Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/175"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/176": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T09:01:57Z",
|
||||||
|
"seen_at": "2026-04-19T09:01:57Z",
|
||||||
|
"sources": [
|
||||||
|
"oss-sec"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://seclists.org/oss-sec/2026/q2/176"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/AMD-2026-New-SMCA-Bank-Types": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "Linux 7.1 Adds New AMD SMCA Bank Types, Presumably For Upcoming EPYC Venice",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/AMD-2026-New-SMCA-Bank-Types"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/AMD-FP-DSS-Zen-1-Bug": {
|
||||||
|
"bucket": "toimplement",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/AMD-FP-DSS-Zen-1-Bug"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/AMD-Harvested-GPUs-Linux": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "Valve Developer Further Improves Old AMD GPUs: HD 7870 XT Finally Working On Linux",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/AMD-Harvested-GPUs-Linux"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/AMD-RDNA4m-RADV-ACO": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/AMD-RDNA4m-RADV-ACO"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/CachyOS-Super-Charged-Linux-7.0": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "CachyOS Rolls Out A Super-Charged Linux 7.0 Kernel",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/CachyOS-Super-Charged-Linux-7.0"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/GNOME-Graphs-2.0-Maps-Transit": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/GNOME-Graphs-2.0-Maps-Transit"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/GhostBSD-26.1-R15.0p2": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "GhostBSD 26.1 Now Based On FreeBSD 15.0, Switches to XLibre X Server",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/GhostBSD-26.1-R15.0p2"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/KDE-Plasma-6.7-Session": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/KDE-Plasma-6.7-Session"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Block-Changes": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "Linux 7.1 Sees RAID Fixes, IO_uring Enhancements",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Block-Changes"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Crypto-QAT-Zstd": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "Intel QAT Zstd, QAT Gen6 Improvements Merged For Linux 7.1",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Crypto-QAT-Zstd"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-HRTIMER-Overhaul": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-HRTIMER-Overhaul"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-New-NTFS-Driver": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-New-NTFS-Driver"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Scheduler": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Scheduler"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Sound": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "Linux 7.1 Sound Code Adds Bus Keepers: Aiming For Better Apple Silicon Support",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/Linux-7.1-Sound"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/Wine-11.7-Released": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/Wine-11.7-Released"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/news/WireGuard-For-Windows-1.0": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"seen_at": "2026-04-19T14:06:07.928573+00:00",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "WireGuard For Windows Reaches v1.0",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/news/WireGuard-For-Windows-1.0"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"https://www.phoronix.com/review/ubuntu-2604-xe2-lunar-lake": {
|
||||||
|
"bucket": "unrelated",
|
||||||
|
"first_seen": "2026-04-18T14:24:43Z",
|
||||||
|
"seen_at": "2026-04-18T14:24:43Z",
|
||||||
|
"sources": [
|
||||||
|
"phoronix"
|
||||||
|
],
|
||||||
|
"title": "",
|
||||||
|
"urls": [
|
||||||
|
"https://www.phoronix.com/review/ubuntu-2604-xe2-lunar-lake"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"sources": {
|
||||||
|
"amd-psirt": {
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Sun, 19 Apr 2026 11:14:54 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"arm-spec": {
|
||||||
|
"etag": "\"c31f3bde81531617e355836b0f44bb05:1775559058.494352\"",
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Tue, 07 Apr 2026 10:50:58 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"cert-cc": {
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Fri, 17 Apr 2026 16:57:16 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"cisa": {
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"comsec-eth": {
|
||||||
|
"etag": "W/\"ad4d6e03055d4fc084e06c1140e33311\"",
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Fri, 17 Apr 2026 18:23:42 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"intel-psirt": {
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"lwn": {
|
||||||
|
"etag": "\"7a7f043e5c25da73032a33e230cd5adc23dce68d29b294dfdcaf96dcaf23a08c\"",
|
||||||
|
"hwm_id": "https://lwn.net/Articles/1068473/",
|
||||||
|
"hwm_published_at": "2026-04-18T15:48:08+00:00",
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"msrc": {
|
||||||
|
"etag": "\"0x8DE9DE3F08E081D\"",
|
||||||
|
"hwm_id": "CVE-2026-4786",
|
||||||
|
"hwm_published_at": "2026-04-19T08:01:53+00:00",
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Sun, 19 Apr 2026 07:19:05 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"oss-sec": {
|
||||||
|
"etag": "\"3ac0-64fc0e1ee44d0\"",
|
||||||
|
"hwm_id": "CVE-2026-41113",
|
||||||
|
"hwm_published_at": "2026-04-18T19:12:07+00:00",
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Sat, 18 Apr 2026 19:15:03 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"phoronix": {
|
||||||
|
"hwm_id": "https://www.phoronix.com/news/AMD-Harvested-GPUs-Linux",
|
||||||
|
"hwm_published_at": "2026-04-19T13:25:50+00:00",
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"project-zero": {
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Tue, 31 Mar 2026 22:50:42 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"transient-fail": {
|
||||||
|
"etag": "W/\"67ab337f-158c5\"",
|
||||||
|
"hwm_id": null,
|
||||||
|
"hwm_published_at": null,
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Tue, 11 Feb 2025 11:24:47 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
},
|
||||||
|
"vusec": {
|
||||||
|
"etag": "W/\"6391ad5e2c03310cced577acfca52f46\"",
|
||||||
|
"last_fetched_at": "2026-04-19T14:02:00.309888+00:00",
|
||||||
|
"last_modified": "Mon, 16 Mar 2026 09:18:55 GMT",
|
||||||
|
"last_status": 200
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}