
Wednesday, September 28, 2022

WaPo: Shots affect Periods and More...

From the Washington Post (September 27, 2022):

Women said coronavirus shots affect periods. New study shows they’re right.

A coronavirus vaccination can change the timing of when you get your period, according to research. For most people, the effect was temporary.

The article begins (underline my own): "Not long after the rollout of coronavirus vaccines last year, women around the country began posting on social media about what they believed was a strange side effect: changes to their periods.

Now, new research shows that many of the complaints were valid."

The linked study from the British Medical Journal via the US NIH:

Study confirms link between COVID-19 vaccination and temporary increase in menstrual cycle length

Large NIH-funded study included participants in North America and Europe.

Which points to

Menstrual changes after covid-19 vaccination

BMJ 2021; 374 doi: https://doi.org/10.1136/bmj.n2211 (Published 16 September 2021)

Cite this as: BMJ 2021;374:n2211

Which says "... Changes to periods and unexpected vaginal bleeding are not listed, but primary care clinicians and those working in reproductive health are increasingly approached by people who have experienced these events shortly after vaccination. More than 30 000 reports of these events had been made to MHRA’s yellow card surveillance scheme..."

And then there's this: 

Research Letter September 26, 2022

Detection of Messenger RNA COVID-19 Vaccines in Human Breast Milk

"... the initial messenger RNA (mRNA) vaccine clinical trials excluded several vulnerable groups, including young children and lactating individuals ... The Centers for Disease Control and Prevention recommends offering the COVID-19 mRNA vaccines to breastfeeding individuals, although the possible passage of vaccine mRNAs in breast milk resulting in infants’ exposure at younger than 6 months was not investigated ..."

and this:

COVID-19 Vaccines

Last Revision: September 19, 2022.

" ... breastfed infants may experience sleepiness, increased fussiness, fever, rash or self-limiting diarrhea, but no serious adverse effects have been reported "

and

"Only a small percentage of milk samples from women who received an mRNA vaccine contained trace amounts of mRNA."

and this:

Yet during the pandemic we saw (from FOIA requests to the CDC and FDA) this:

BIDEN ADMINISTRATION

How the CDC Coordinated With Big Tech To Censor Americans

and a portion of an email where women's problems are actively dismissed as "misinformation" and links to related discussion groups taken down:

Notice the email date: May of 2021.

The CDC and FDA had no knowledge whatsoever whether what they were censoring as "misinformation" was true or false.

In fact, according to this, "Breast-feeding mothers, infants and pregnant women were in the exclusion criteria list for the phase III clinical trials of both Pfizer[3] and Moderna[4]."

They simply made it up as they went along.

You have a Constitutional right to treat yourself medically as I wrote in "The 'Right not to Smoke'"

C++ : Half Oyster, Half Carrot

C++ is a modern disaster.

Poor noobs can't understand memory management.  "It's too hard."

Nearly 70% of Chrome bugs are memory related (according to this).

Apparently experienced programmers don't understand it either.

Why is this?

Why is C++ off transmogrifying itself into Lisp (wait until you see the REPL)?

Watching videos on various C++ conferences is horrifying.

"Igor, let's graft ranges onto for loops!"

Reminds me of the movie "Time Bandits" where Evil says "You are a very troublesome little fellow [ C++ ]. I think I should teach you one of my special lessons? What do you think, Robert? Benson? What would look nice? Half-warthog? Half-donkey? Half-oyster? Half-carrot?"

What would look nice? 

Half python, half lisp?

Half PL/I, half Haskell?

Half templates-as-turing-equivalent, half Fortran?

Forty years ago I worked for a company called Unilogic, Ltd. that sold a document system called Scribe (I think the bibliographic reference format - see this - can still be found today).

Scribe was written in Bliss (see https://www2.cs.arizona.edu/classes/cs520/spring06/bliss.pdf), "Bill's Language for Implementing Spiffy Systems" according to some.  It was used internally by Digital Equipment Corporation (DEC) to write things like VMS, etc. (see this paper on the history of Bliss).

When I arrived at Unilogic in 1981 James Gosling (of Java fame) had been working part time (while earning his PhD from Carnegie Mellon University) on technology to convert Bliss to C so Scribe could run on non-DEC platforms.  He also created his own version of emacs (we called it gosmacs).  As he moved on from CMU I inherited the problem of converting Scribe to C so that the Scribe market could be increased from only DEC VAX/VMS (which wasn't popular at universities).

At the time BSD unix on VAX was a hot area but, unfortunately, it only supported C and not Bliss as it wasn't a DEC product.

Eventually the Gosling translation process was replaced by technology I created and we were able to convert Scribe's Bliss source code into Ada, C, and other languages.

To accomplish this we initially (a team of myself and perhaps six other people) created a Lisp-based translator process that consumed entire programs, processed them into an intermediate semantic tree containing not only the operators but also the comments and original source text, stored in an object-oriented database form, and a "dumper" that wrote out the intermediate form in some different language.

(I am glossing over many, many details here - after all it was forty years ago - but for now this is good enough.)

Our customers, among others, were Bell Labs (Fortran to C CAD system), Honeywell (PL/M  for AN/UGC-144), of course Scribe Systems (what Unilogic had become), NASA (Fortran) and Boeing.

One customer, Rational Systems I believe they were called, was building an Ada machine - a physical piece of hardware that executed some kind of compiled Ada.  They wanted Scribe to run there.

By this time the translation technology we had worked for Scribe fairly well, so well, in fact, that it was easy to "lift up" the semantic translation and implement a "virtual byte address machine" in the output of the translation.  The effect of this was to implement byte addressable memory as a giant Ada array and to translate operations on memory in the source language into array operations in Ada.  Surprisingly, it worked well: the Ada version of Scribe passed all its acceptance tests and was shipped to the customer.


We scooped WASM by thirty-five years.

There were all the same problems as you see with WASM - for example: what to do about the runtime.  In those days C was not the language of choice - but its prevalence at universities caused new hires to want to go in the Unix/C direction, away from old standards like Fortran.

I remember attending a Unix conference in Toronto in the 1970s and joking with the guy selling Gosling's emacs that one day IBM would use Unix.  Little did we know...

So what's the point of this rambling?  How does it relate to the disaster of C++?

I learned a lot in this process about the structure and nature of programming.  I saw how compiler writers thought and I saw how users of those compilers thought.

The "semantic model" we had for universally representing programming languages became the way I thought.  I don't think in C++ or Lisp or Python.  I think in an abstract semantic model format and translate what I am thinking into the appropriate output language, just like the translator used to do.

What I see today with C++ is wrong.

Wrong because the more features, bells, and whistles get added, the further the language moves from someone's intent and understanding.

For example, take a C++ lambda.  A context and a function pointer.  Before there was C++ that's how you would implement the conceptual lambda - a struct for a context with a corresponding pointer to a function that used the struct for the bindings.

This is a hard thing for noobs to understand.  Lambdas are scary and mysterious because they don't follow the traditional sequential flow of other programming models.

Adding layers and layers of silly semantics (like '&' and '=') just makes understanding and debugging harder.  They don't make the ideas behind it simpler or easier to understand.  They obscure what's happening.  The compiler works harder, and the output of the compiler is less efficient.

(C++ today relies on the 1ns cycle times available in today's processors.  Had processors not gained such efficiencies (a DEC PDP/11 cycle time was around 800ns for 16 bits) C++ would be a failure.)

C++ is living on borrowed time.

I have worked on software products that spanned decades.  The most important things you can do are A) write code so the dumbest person you can think of can understand it and B) make tools like the debugger work as well as possible with the code so that dumb person can quickly and easily understand the code's operation.

Why?

Because that dumb person is going to be you...


Big Sky Big Dog Walk #516

 


Monday, September 26, 2022

Sunday, September 25, 2022

Big Sky Big Dog Walk #513

Shot with iPhone, 195° fisheye - cleaned up some in Premiere Pro.

Trying to get the camera rig set up properly.



Big Sky Big Dog

Big dog walk... 

Daily walk - five (5) Great Pyrenees and one (1) Cavalier King Charles Spaniel.

We walk every day and have since the puppies (Red (grey lead attached to me though hard to tell in this video), Charity (black lead) and Hope (blue lead)) were old enough to go (April 2021).  This is the 512th walk (about) with the entire crew.  Cookie (yellow lead) and Cho (hand held grey lead)
are the parents and have been walking for years.

Playing with cameras and lenses on subsequent walks.  Probably going to make a camera rig to make this easier.

This video shot in one take, no cuts, on an iPhone.




Friday, September 23, 2022

Washington County Oregon Sues FOIA Requestor

Interesting story here: Oregon records dispute plays out against backdrop of election fraud ‘big lie’

Washington County is seeking a court order to block the release of a file related to the May 2021 election

BY: JULIA SHUMWAY - JUNE 14, 2022 5:45 AM

Tim Sipple, a libertarian, filed a FOIA (Freedom of Information) request with Washington County, Oregon for "...the SQL, or Structured Query Language file, a set of code used to create or modify the contents of a database, for the test of the May 2021 election..." (from the article).

Initially the county pushed back, citing Oregon law that the article says "exempts records that include trade secrets, a computer program developed or purchased by an agency or that could identify security measures, weaknesses or potential weaknesses."

Undeterred, Sipple "... appealed the county’s decision to Washington County District Attorney Kevin Barton, who in February concluded the files were public and ordered the county to provide them."

That didn't sit well with the county so they sued Sipple to prevent his collecting the FOIA records.

This was in June 2022.

The trial was held recently and, according to Dr D G Frank, who provided testimony, here are his takeaways from the trial (available here on MeWe):

"Oregon Case Take-Aways"

It was a very productive week in Oregon at the trial in Washington County! I testified in the case where the SoS and County Officials were demanding that a local citizen not publicly share an election system file that was given to him by county election officials (the backup file for the 2020 General Election).

Here are my take-aways:

Oregon election officials are technically incompetent (County Clerk, AD, SoS). They have little understanding of the systems they manage, and little understanding of how vulnerable and porous their network security is.

The Oregon AG admitted in formal court documents that their election systems are accessible wirelessly even when they are not connected to local networks.

The Washington County clerk (retiring after 33 yrs) testified that she could turn on the modems on her election machines (the ones that are supposedly "air-gapped") with a few mouse clicks and without a password. (After all, they are just an off-the-shelf Dell laptop.)

Washington County election networks are using older, long-ago-hacked, security protocols.

Social 'sciences' use different mathematics than standard mathematics and statistical methods. (Math is racist, and counting ballots can only be performed by someone who has been trained as a political scientist.)

Oregon voters are using ballots that are not anonymous.

Oregon election officials are afraid to let the public inspect their systems, files, and procedures. Instead of making their election systems more transparent and auditable, Oregon election officials are trying to hide everything. (Of course, that increases the public trust in their processes.)

After careful inspection of election systems, results, and dozens of meetings with election officials in 43 states, I am sad to say that most of the above points apply to every state.

We need to "Vote Amish." All paper, no machines. Single day of voting on paper ballots with photo ID and paper poll books. Hand counting and reporting of tallies in traceable, transparent, and auditable ways.

So let's summarize this testimony:

A. These people don't know anything significant about the technology they use to secure your Constitutional right to have your vote accurately counted.

B. These systems are vulnerable to external wireless access.

C. Passwords are not required.

D. These people are afraid you will find out the truth about what they do and how they do it.

E. They will resort to suing people to protect their incompetence.

F. Based on Frank's involvement in forty-three (43) other states, most of the above points apply to every state.

I suppose an actual trial transcript will eventually be available.

Please continue believing that what is being provided here is not true and that Facebook, Twitter, etc. (again, known liars in cahoots with the US government) are telling you the truth: everything is "just fine."

Friday, September 16, 2022

Facebook Admitting Government Collusion to Censor Speech

Senator @HawleyMO Grills Facebook Executive And Gets Him To Admit That They Have Coordinated With The Biden Administration To Censor Constitutionally Protected Speech

https://twitter.com/ColumbiaBugle/status/1570477997836173312

C++ Decays into Lisp (Chapter 1)

I have been using Mathematica since about version 3.0 (roughly 1996).  I also have extensive Lisp experience beginning in the 1980's. I ported Portable Standard Lisp (https://github.com/blakemcbride/PSL and https://en.wikipedia.org/wiki/Portable_Standard_Lisp) to VMS for use at LEXEME Corporation.  LEXEME initially used Lisp to build what I considered to be a successful computer-language-to-computer-language translation system (more on this company in future posts).

Mathematica is a Lisp-ish offshoot in terms of its core Math Kernel functionality.  The UI is otherworldly in terms of power when compared to most IDEs - particularly in the area of complex mathematical and textual formatting as well as 2D and 3D graphics functionality.  I have routinely used Mathematica and its UI over the last few decades to construct many complex applications and prototypes.  (There is also the "Workbench," an Eclipse-based Mathematica IDE - I have not played with that.)

So, more or less, forty some years of lisp and lisp-like language experience.

Key features of these languages: lambdas, rule processing, programs represented as data, and all the other Lisp-ish features, and, most importantly, lack of speed.

All that said, I have also used C professionally since about 1976 and C++ since the mid 1990's, as well as Python, Java, JavaScript, and many more.

The reason for this post is the disastrous direction "C++" has taken over the years.  Disastrous you say?  Why C++ is becoming the greatest thing in programming since sliced bread!  How can you say this!

As a for instance, let's take a look at this video:

The first part describes the explosion in "pages in the standard" from 500-ish to thousands over the years (I smell design by committee).  Not a good start.

From the video

The next chunk of video describes code of the form:

    #include <iostream>

    int f();  // returns the next value in a sequence

    int main() {
        std::cout << f() << '\n';
        std::cout << f() << '\n';
        std::cout << f() << '\n';
        std::cout << f() << '\n';
    }

With a lengthy discussion of various ways (lambdas, generators, static variables) to configure the function f() to return sequential values.

    static int f_val = 0;

    int f() {
        return f_val++;
    }

In the olden days the above function f() would accomplish this task.

However, the video takes you through ever more complex forms of the same functionality: first lambdas and finally a really complex "generator".

All that said, the important takeaway is this: why keep adding ever more complex features to the language when the language already has the necessary features to accomplish the task?

The "olden days" version generates code that looks like this on an Intel processor (using clang):

    push    rbp
    mov     rbp, rsp
    mov     eax, dword ptr [rip + f_val]
    mov     ecx, eax
    add     ecx, 1
    mov     dword ptr [rip + f_val], ecx

    pop     rbp
    ret

(Taken from compiler explorer at https://godbolt.org/)

We can use a lambda (one of many possible ways) to do the same thing:

    int f_val = 0;
    auto f = [&]() {
        return f_val++;
    };

which produces this code:

    push    rbp
    mov     rbp, rsp
    mov     qword ptr [rbp - 8], rdi
    mov     rax, qword ptr [rbp - 8]
    mov     rcx, qword ptr [rax]
    mov     eax, dword ptr [rcx]
    mov     edx, eax
    add     edx, 1
    mov     dword ptr [rcx], edx
    pop     rbp
    ret

Here we see seven instructions instead of four (a 75% increase), and our language has progressed in complexity as measured by the standard maybe 50% (at least to the C++11 features used here).

Why is this okay?

Well, for one thing processors today are insanely fast compared to the 1990's...  (A PDP-11/20 used in the early 1970s for Unix had an instruction cycle time of perhaps 800ns, current processors have a cycle time of perhaps 1ns as well as many advanced features like caches, and so forth.) 

So we can be less efficient in our generated instructions and our thinking.  Yes, our thinking...

All the muddled C++ "standard" expansion is basically tied to the idea that it's "too hard" for random Joe programmer to manage memory on his own - either with malloc/free or new/delete.  Really?  Try to figure out some old, giant code base stuffed with "auto" that takes months to adequately decode before you can really make a knowledgeable fix.  How is this better?

So, to solve the problem, we turn a memory pointer language into one that tries to get the programmer to structure the code in such a way as to not need memory pointers.

The other sob story is that programmers need "efficiency" - python and friends are too slow (being interpreted or garbage collected) - so, rather than re-evaluate our programming situation, we need to move features of these other languages into C++.

But let's weigh this against the real cost of software: maintenance over lifetime.  Instead of one way, albeit tedious or using explicit memory allocation, we instead have created a generation of standards "expansion" so that either we have a mix of forty plus years of "coding standard" mixed into a single application or a continuous "rewrite" process that requires the code base to be constantly "reworked" to remove old "standard" code and replace it with new, less efficient "standard code".

Today C++ aspires to have all the things that are "wrong" with Lisp: auto instead of explicit types, impossible-to-understand code (using auto, for example, to hide sins which only a compiler may know), explosive error messaging (same as losing a paren in Lisp), a thousand ways to do simple things, and on and on.

And some things that are not related to Lisp: an entire interpreted Turing-complete template model.

Things like GPUs and graphics standards such as OpenGL and Vulkan make writing "portable" code impossible because the libraries for these things are more complex than the base application.

Let's not forget WASM.  The new architecture to take your old, creaking C++ application into the modern operating system of a browser.

What's next in C++, piping like in shells, oh wait, it's already there!

Perhaps a C++ REPL?

I can't wait!

I mentioned Mathematica at the start because, of all things, it does the best job of bringing everything you might want to do when developing complex applications into one place: code, graphical output, text formatting, documentation, handling complex math.  You can build prototypes quickly and with minimal effort.  They will run on any supported machine.

It's certainly no panacea but it's the most effective thing out there in many ways.

I think Mathematica needs a more Atom/Julia-like environment (though sadly Atom is being shelved in favor of VSCode) and a better interface to the volume of libraries out there (rather than just their own).

Things like VSCode are nice but they never quite work on large code bases.

No, instead of this ridiculous decades-long C++ "standards" effort how about a global effort to design a real development environment that we, as humanity, can be proud of.

A language that has a debugger that always works.

An IDE and language that is designed to handle new hardware: GPU, AI.

A development environment that can handle large, complex at-scale systems.

A language that allows control over efficiency.

A language that runs "everywhere" - hardware, GPU, browser.

I think that the focus on the language is really unimportant to a large degree (sure it needs to be readable).  What is important is a real development environment that is portable, easy to use, helpful, works across all platforms and is designed to handle the life cycle of real software.



Sunday, September 11, 2022

Unbreakable Crypto (Computational One Time Pad)

Alice and Bob wish to exchange totally secure messages which cannot be decoded by any third party.

They create a web service that broadcasts (using EventSource) non-deterministic random number generator (NDRNG) blocks of 8-bit integers (1,024, 2,048, etc.), pulled from a hardware NDRNG - or perhaps they choose a commercial service.


(Initially they may require some external verification that the service they are using is in fact the same, i.e., not hacked in some way.  This could be accomplished, for example, with a public key signing of the blocks.)


Each NDRNG block has a sequence number and is transmitted at a short, prescribed, predictable time-based interval.


Alice and Bob each possess a computer capable of subscribing to this EventStream to capture blocks.  The computer will also be able to capture arbitrary web data.


While Alice and Bob could agree to use the NDRNG EventStream blocks directly at specific times as a one-time cryptographic pad obviously anyone else who happened to capture the same block(s) would be able to trivially decode their messages.


To avoid this problem Alice and Bob agree to exchange a specific type of program as a ‘key’.  The program has specific, unique properties:


1) It is “synchronous” with respect to both Alice and Bob in time such that information on its state of computation (data, program counter, program memory, blocks, altered blocks, etc.) is the same for both Alice and Bob over time, i.e., both their computers compute identical values and have identical state at identical times.


2) It consumes and retains some set of block data from the EventStream.


3) It performs operations on blocks in a “closed” manner: operations must not introduce data derived from non-block sources.  For example, treating two sections of two blocks as 32-bit integers and replacing one section with the sum of the two integers is valid because only block data is used.  The addition could generate a carry which, again for example, could be used to shift another block.  This is valid as the carry is the result of a block-only operation.  But adding an epoch time to a block, for example, in this way would be invalid.


4) The order and selection of operations within the program may be controlled by block data or be coordinated using some data synchronously gathered by both Alice and Bob.  For example, the selection of which 32-bit integers to use within which block and the ordering of these additions could be controlled by bits fetched from some portion of a Bitcoin ledger selected by the program (obviously the program would fetch arbitrarily many external data sources so tracking web operations would not easily reveal what data was actually used).  Further, block data can be used to alter, modify or add to the program over time as well as exclude blocks or other previously computed data from future computations.

  

5) At various synchronous points of operation the program extracts an arbitrary set of data from its state to create a “one-time pad.”  Pad creation may, for example, be performed by an algorithm that ensures a uniform distribution of ones and zeros in the pad.  Bob and Alice exchange messages using these one-time pads: message exchanges are synchronized in time to the operation of the program.


The idea here is that the internal state of the program's memory is a set of altered blocks of NDRNG data.


It would be impossible to monitor all data received by Bob or Alice and attempt all possible permutations of all possible operations on that data in order to guess the program state.


Pads are continuously generated over time and the only connection between Alice and Bob is the state of this computation.


Notes: https://www.design-reuse.com/articles/27050/true-randomness-in-cryptography.html

Tuesday, September 6, 2022

And the Winter will be Big...

Эпический газовый тролль (Epic Gas Troll...)

And no, it's not your Game of Thrones "winter is coming..." nonsense.

А зима будет большая…
Вот, гляди-ка, за рекой
Осень тихо умирает,
Машет желтою рукой.
Плачут мокрые осины,
Плачет дедушка Арбат,
Плачет синяя Россия,
Превратившись в листопад.
И, сугробы сокрушая,
Солнце брызнет по весне…
А зима будет большая —
Только сумерки да снег.

And the winter will be big ...
Here, look, across the river
Autumn is slowly dying
Waving yellow hand.
Wet aspens are crying,
Crying grandfather Arbat,
Crying blue Russia
Turned into leaf fall.
And, crushing the snowdrifts,
The sun shines in spring...
And the winter will be big -
Only dusk and snow.

Gazprom shuts off EU gas...  

And trolls the west by posting this fascinating video.  

I wonder if the EU will simply and magically turn down everyone's thermostat remotely like they did in Colorado...  And save the day.

Probably not.



Sunday, September 4, 2022

Embalming the Conformant

About a year ago I wrote "Autopsies of the Conformant" regarding changes doctors had seen in patients who had taken the jab.  Scary yet interesting stuff.

Today we find this article "Embalmers Have Been Finding Numerous Long, Fibrous Clots That Lack Post-Mortem Characteristics."  Certainly just another "coincidence" - after all, though these might be "professional" embalmers what do they know about "the narrative" the "science" anything.

According to the article, Mike Adams, who runs an ISO-17025-accredited lab in Texas, using "inductively coupled plasma mass spectrometry (ICP-MS), triple quadrupole mass spectrometer, and liquid chromatography-mass spectrometry, usually testing food for metals, pesticides, and glyphosate," tested the clots.

“We have tested one of the clots from embalmer Richard Hirschman, via ICP-MS. Also tested side by side, live human blood from an [ unjabbed ] person,” Adams told The Epoch Times.

He found that the clots are lacking key elements present in healthy human blood, such as iron, potassium, and magnesium, suggesting that they are formed from something other than blood. (Yet they are in the arteries.)

“Prior to 2020, 2021, we probably would see somewhere between 5 to 10 percent of the bodies that we would embalm [having] blood clots,” Richard Hirschman, a licensed funeral director in Alabama, told The Epoch Times.

“We are familiar with what blood clots are, and we’ve had to deal with them over time,” he said.

He says that now, 50 percent to 70 percent of the bodies he sees have clots.

The analysis of the clots is interesting:

Element            Blood (Unjabbed)   Clot (Jab Unknown)

Mg (Magnesium)     35 ppm             1.7 ppm
K (Potassium)      1,893 ppm          12.5 ppm
Fe (Iron)          462 ppm            20.6 ppm
Cu (Copper)        1 ppm              0.3 ppm
Zn (Zinc)          7.9 ppm            2.4 ppm
Al (Aluminum)      1.3 ppm            1.6 ppm
Na (Sodium)        1,050 ppm          1,500 ppm
C (Carbon)         137,288 ppm        152,845 ppm
Ca (Calcium)       74 ppm             23.8 ppm
Sn (Tin)           163 ppb            943 ppb
Cl (Chlorine)      930 ppm            290,000 ppm
P (Phosphorus)     1,130 ppm          4,900 ppm

I find this fascinating, particularly the magnesium numbers.  I myself (and others close to me) have suffered leg cramping for many, many years (perhaps a decade pre-Covid).  No amount of stretches or other physical therapy-type actions ever produced any improvement.  However, adding magnesium to my daily vitamin regimen stopped them within the time it took for the magnesium to travel through my digestive system and reach my muscles.  According to Mrs. Wolf magnesium is needed to help muscles loosen after contraction.  Calcium also plays an important role in this process.

So the effect, without magnesium, is that your leg muscles become like a "pressure suit" you would find jet pilots wearing (these suits squeeze your lower body to keep blood in your brain so you don't pass out during high-G turns).  The leg muscles don't fully relax after contracting, restricting blood flow - hence you end up with high blood pressure because your leg blood flow is restricted.

But, then again, this is not "the science" which cannot be questioned - must be just a "coincidence."

In this table we see a 20x reduction in magnesium in these clots inside the arterial blood vessels.

We also know about 68% of the US population according to this is jabbed (in one form or another).

Then there is the corresponding rise in "excess deaths" in the last few years (as an example: https://news.ambest.com/articlecontent.aspx?refnum=317540&altsrc=43).

Coincidence?

Of course.  Because fact and evidence that don't fit the "science" are to be ignored or hidden.

The Epoch Times article continues:  “In 20 years of embalming, I had never seen these white fibrous structures in the blood, nor have others in my field. In the past year, I have seen these strange clots in many different individuals, and it doesn’t seem to matter what they die of, they often have similar substances in their blood. This makes me very concerned because if something is wrong in the blood, it begs the question: is something causing people to die prematurely?” Hirschman said.

“As the summer [of 2021] went on, COVID deaths were on the decline, but these clots were increasing in number. My suspicion is that the [ jab ] may be the cause of these strange clots. I realize that I am not a doctor nor am I a scientist, but I do know what blood looks like and I am very familiar with the embalming process that I have been doing for two decades. I do not know 100 percent what causes these clots, but I do know from my experience and through speaking with several other embalmers and funeral directors none of us had seen this strange clotting before.”

Hirschman sent the clots to a few pathologists and claims that some of them have “overlooked” them, probably due to fear of retaliation.

He has embalmed thousands of bodies and is very familiar with blood, and he feels that the blood of most of the bodies he has seen in the last two years “has changed.”

The Epoch Times article goes on to outline numerous others in the funeral/embalming field who have had similar experiences.

The article concludes with this: “The very large blood clots that are being removed before and after death are unlike anything we have ever seen in medicine,” Dr. J T Horp [ typos my own to evade the thought police - think US Olympic Gold Medalist in Track and Field Athlete in the 1930's], a maternal-fetal medicine expert who has been observing anomalies in pregnant women and fetuses, told The Epoch Times.

“The COVID-19 vaccine diverts energy away from the physiologic processes in the body towards the production of the toxic spike protein. This directs energy away from the normal process of internal digestion also known as autophagy. This results in protein misfolding and propagation of large intravascular blood clots and also a variety of related diseases including prion disease, Creutzfeldt-Jakob disease, amyloidosis, and dementias including Alzheimer’s and others. While it is possible that COVID-19 illness in itself could potentially contribute to these diseases, it is unlikely and if so the effect of the vaccine would be 100- to 1,000-fold greater than that of COVID-19 disease,” Horp said.

This is not limited to the US.  See "German COVID-19 Autopsy Data Show Thromboembolism, ‘Heavy’ Lungs" on German results for example (note: thrombosis occurs when blood clots block veins or arteries).

But these, I am sure, are merely coincidences...