The original research in this video is made possible by generous contributions from supporters of the Dr.SHIVA Truth Freedom Health® movement. Please contribute so we may continue to bring you such original research, valuable education, and innovative solutions.
In this Discussion, Dr. Shiva shares the results of his groundbreaking study revealing over 200,000 mail ballots with mismatched signatures were counted without being reviewed (“cured”) in Maricopa. This is the first study to calculate signature matching rates to provide a quantitative framework for assessing signature verification of mail ballots.
- Dr.SHIVA Ayyadurai, MIT PhD – Inventor of Email, Systems Scientist, engineer, educator – presents the first study to calculate Signature Matching Rates and to provide a quantitative framework for assessing Signature Verification of early voting mail ballots (EVBs).
- In Maricopa County, 1,911,918 EVBs were received and counted.
- The County reported that no more than 25,000 of these ballots (1.3%) had signature mismatches and required review (“curing”); and of those 25,000, 2.3% (587 ballots) were confirmed as signature mismatches after curing.
- A Pilot Study recruited three novices and three experts (forensic document examiners) to calculate signature matching rates on the same sample of 499 EVB envelopes. The purpose of this Pilot Study is to determine if results warrant any further investigation.
- All six reviewers, who were presented images of EVB envelopes to evaluate whether the signatures on those envelopes matched genuine signatures on file, concurred that 60 of the 499 EVBs (12%) were signature mismatches.
- Based on this Study, over 204,430 EVBs should have been cured vs. the 25,000 that the County actually cured; and, using the County’s 2.3% post-curing rate, 5,277 EVBs should have been disallowed.
- Though this Pilot Study is compelling on its own, an expanded study is warranted.
Hello, everyone, this is Dr. Shiva Ayyadurai. I hope you’re well. I’m going to be sharing with you a study that I did, a pilot study, some of which I shared yesterday, but I thought it would be good to redo it today, because sometimes when these kinds of things are rushed, they’re not as good.
I want to go into detail on the pilot study, which revealed that over 200,000 mail ballots in Maricopa with mismatched signatures were counted without being fully reviewed. A full review is also known as curing. So that’s what we’re going to go over and discuss today.
So again, let me bring up the title of today’s talk. As you can see, it’s called “Irreconcilable Differences: Over 200,000 Mail Ballots with Mismatched Signatures Counted Without Being Reviewed”, which means cured, and you’re going to understand what that terminology means in Maricopa. To the best of our knowledge, this is the first study to calculate signature matching rates and to provide a quantitative framework for assessing signature verification of mail ballots.
The whole area of mail-in ballots has obviously become very controversial; it’s split the country, and there’s a lot of division. But one aspect of mail ballots, as you’re going to learn shortly, is verifying signatures. And we’re going to discuss that. So today you’re going to learn about signature verification.
We’re going to go over what took place in the 2020 election in Maricopa, how they verified signatures and what results they had. Then we’re going to share two important experiments we did, where we had novices look at side-by-side signatures and tell us whether the signatures from the same person matched or not.
And similarly, we did the same thing with what we call experts, Forensic Document Examiners (FDEs). In the last part, I’m going to talk about how we combined both results, contrasted them, and also pooled them together to come up with the most conservative signature mismatch rate. That mismatch rate came to 12%.
When you multiply the 12% by all the early voting ballots that came in, our analysis shows that over 200,000 ballots had signature mismatches and should have gone through the full review process called curing. In Maricopa, however, only 25,000 ballots were cured. Our intention in doing this is to provide a scientific study so we can educate people.
And we hope the election officials in Maricopa will engage in a discourse and discussion. This study is being delivered to the Arizona Senate as well as the Attorney General; it was commissioned by the Arizona Senate. So let me just jump right into it, and I want to welcome everyone for joining us. We completed this study on January 14 and presented it to the Arizona Senate liaison on January 16. We did the final report a few days ago and did a public presentation recently, but I thought it would be better to do it again.
I did it from home at the time, and because of the importance of this, I thought it would be better to do it a little more formally. Okay, so let me go right into it. By the way, notice it’s a pilot study. A pilot study is typically done when we have an idea, an inkling in science that something is happening, and we want to explore it. But we go into it, obviously, with a hypothesis.
By the way, if you want to know more about me, it’s all in here. My background is in engineering systems, over 40 years of work, and 40-plus years in pattern analysis. If you want to know more about my background, please go to vashiva.com.
But this publication is a publication of the Election Systems Integrity Institute (ESII). It’s a newly launched institute, and we’re dedicated to providing independent research and infrastructure, so those of you who are interested in doing audits or want to learn, please go check out the institute. It really exists to support election systems integrity. This publication documents the work completed by EchoMail, which was commissioned by the Arizona State Senate to perform the work in this study.
Table of Contents
So we have various items we’re going to cover today. We’re going to go through a quick executive summary, a background, the methodology that was used in Experiment One, Experiment Two, that combines analysis of novices and experts. We’re going to do that and then we’re gonna have a discussion, and then we’re going to go to conclusions.
Here is the executive summary. First, this is the first study to calculate signature matching rates and to provide a quantitative framework for assessing signature verification of early voting mail ballots, or EVBs, the acronym I’m going to use.
In Maricopa there were 1,911,918 early voting mail ballots that were received and counted, and the county reported that no more than 25,000 of these ballots (1.31%) had signature mismatches and required review and curing to resolve whether they were indeed mismatches. Now, our pilot study recruited three novices and three experts, forensic document examiners, to calculate signature matching rates on the same sample of 499 EVB envelopes.
So we took a sample; I’ll explain that. The purpose of this pilot study is to determine if results warrant any further investigation. All six reviewers, who were presented images of EVB envelopes to evaluate whether the signatures on those envelopes matched the genuine signatures on file, concurred that 60 of the 499 were mismatches. All six agreed, and they did it all independently. That is a 12% mismatch rate; Maricopa says 1.3%. Based on the study, over 200,000 early voting ballots should have been cured, versus the at most 25,000 that the county actually cured.
And though this pilot study is compelling on its own, we believe an expanded study is warranted. Okay, so let’s go to the background. What is signature verification? Again, to quickly review this for those of you who are new, and by the way, I didn’t know what a lot of this was until I got involved in it, so we’re all in the same boat; I’m just about a year and a half ahead of you. Signature verification is a multi-step process that aims to verify signatures based on a review of two signatures side by side: one being genuine, the other being questioned.
What Is Signature Verification?
And what happens is early mail ballot envelopes come into a central facility, where they’re scanned to create envelope images. Those are the early voting ballot envelopes that are scanned to produce images. Then, in Step One, an initial review is done, taking about 4 to 30 seconds; using the county’s procedures, reviewers compare signatures on envelopes with genuine signatures on file to determine match or mismatch.
So they ask, “Does this match this?” And by the way, the people who do this are trained in a very short period of time; it’s novices who do this. If it is a no match, it goes to curing. So all these ballots come in, the 1.9 million plus, and the ones the initial reviewers say don’t match go into curing, which means a more detailed review; if they match, the envelope is opened and sent for tabulation. Curing takes more than just 4 to 30 seconds: initial reviews take only 4 to 30 seconds, while curing takes 3-plus minutes.
And the investigation includes attempts to contact voters to determine if the signature is indeed a match or no match. After curing, if it’s confirmed as a no match, the ballot is not counted; if it’s confirmed as a match, it’s sent for tabulation. So basically the voter gets two shots: one at the initial review, and one during curing. Curing was set up really for fairness. Now, in Maricopa County, let’s look at what they actually did.
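As a rough sketch (not the county's actual software; the function and variable names here are illustrative), the two-shot flow described above could be modeled like this:

```python
# Hypothetical model of the two-stage signature verification flow:
# a quick 4-30 second initial review, then curing (a 3+ minute detailed
# review with voter contact) for any envelope flagged as a mismatch.
def process_envelope(initial_review_match, cure_match):
    """Return the fate of one EVB: 'tabulate' or 'disallow'.

    initial_review_match / cure_match are booleans standing in for the
    outcome of each review stage.
    """
    if initial_review_match:      # quick side-by-side comparison says match
        return "tabulate"         # envelope opened, ballot counted
    if cure_match:                # flagged, but curing confirms a match
        return "tabulate"
    return "disallow"             # confirmed mismatch: ballot not counted

# An envelope that fails the initial review but is confirmed during curing:
print(process_envelope(False, True))   # → tabulate
```

A ballot is disallowed only when both stages conclude "no match," which is the voter's two shots described above.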
There were 1,911,918 early voting ballots, mail-in ballots as we said. The county said that a maximum of 25,000 were cured, which is 1.31%. Then there are the confirmed signature mismatches, where EVBs were not counted.
Those accounted for 587 out of the 25,000. If you compare the 587 to the 25,000, that is 2.3%. But if you compare the 587 to the full 1.9 million counted, that is 0.031%, about three hundredths of one percent. All right, so what did we do? Our team at ESII, our research institute, put forward a methodology, which I’m going to walk you through, and we did two experiments. Let me go right to that. The methodology was the following.
First, we selected a representative statistical sample from the population of 1,911,918 early voting mail ballots, which we, or EchoMail to be specific, had in hand. We wanted to select a sample size that would give us 95% confidence with a plus or minus 4.4% margin of error. So out of those millions of ballots, we selected 499 randomly.
And to give you an idea, in statistics you don’t have to select all of them; you just have to do a random sampling. This is how pollsters work. Again, this is a pilot study, and we could do a much larger study, but even at this level of selection we’re at 95% confidence with a plus or minus 4.4% margin of error. So we selected a random sample of 499. That was the first part. Then we organized a data set of 499 envelope signatures by randomly sampling the repository of 1.9-plus million EVB envelope images we had.
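For reference, the plus or minus 4.4% figure for a sample of 499 follows from the standard worst-case margin-of-error formula for a proportion at 95% confidence. Here is a quick check (a sketch, not the study's actual code):

```python
import math

# Margin of error for a sampled proportion at 95% confidence (z = 1.96),
# using the worst case p = 0.5: MOE = z * sqrt(p * (1 - p) / n).
def margin_of_error(n, z=1.96, p=0.5):
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(499) * 100, 1))  # → 4.4 (percent)
```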
So these are, for example, the signatures from those envelopes. All right, Step Three was to create a data set of 499 genuine signatures that match the names and addresses of the 499 envelope signatures. Now, how did we do that? We sourced from the Maricopa publicly accessible deeds repository, extracting 499 deed signatures. It should be noted that the source of the genuine signatures used in this study is likely different from the source of genuine signatures used by the county.
They typically use voter registration signatures that come from the DMV. However, experts in forensic document examination share that signatures from a deeds repository, which is what we used, may be more valid given that such signatures are notarized. And by the way, if the county is listening and wishes to provide their genuine signatures for these 499 samples, the study can be updated.
So we’re willing to do that. Now, Step Four was to create pairwise data sets of the 499 envelope signatures and the 499 genuine signatures. Here’s the signature from the deeds repository, and here’s the signature right off the envelope. What we asked our reviewers to do was look at these and decide if each is a match or no match. That simple. And they could only spend a maximum of 30 seconds doing this, 4 to 30 seconds, which is typically what’s allowed in the initial review process. Now, let’s go to Experiment One.
We used novices here, three of them. The novices were instructed to follow the county’s guide, so they went through it and reviewed it. In fact, I was one of them, one among the three. So three non-FDEs were selected and instructed to follow the county’s guide. We presented 499 pairwise signatures, allowed no more than 30 seconds each, and recorded everyone’s match or no-match selections. Each non-FDE’s mismatch rate was calculated, as well as what we call the pooled consensus mismatch rate. So one is each non-FDE’s individual mismatch rate; then we looked at all three people and asked how many signatures they all agreed were mismatches.
Results of Experiment I
And that’s called pooled consensus. All right. On the x axis are the signatures, going from signature zero all the way to signature 499. You can see over time how their signature mismatch rates converge down here. We also averaged all of them, and that average came to 20.1%. So 20.1% was the average of the mismatch rates across the three non-FDEs, the novices.
And this is another way of looking at it. You notice 86 out of 413 were no match for this non-FDE, and so on. These two are very close, but the average signature mismatch rate for the novices is 20.1%. Remember, the county’s was only 1.3%; just think about that. Then we did a pooled consensus. This means: for how many pairs of signatures did all three non-FDE novices conclude a match, conclude a no match, or not have agreement? And that’s what this looks like.
So you can see the saffron is no match; the blue is how many times they all matched; and the gray is when, say, one person said match but the other two didn’t, because there are three people here. You can notice that 63 out of 499 are where all three non-FDEs agreed that the signatures on those ballots were mismatches. Using this number, the non-FDE pooled consensus signature mismatch rate is 12.6%, which is close to 10 times higher than the Maricopa County signature mismatch rate. Again, these are the novices.
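The pooled consensus idea, counting only the pairs that every reviewer independently marked as a mismatch, can be sketched as follows (illustrative code with hypothetical toy verdicts, not the study's actual tooling):

```python
# Pooled consensus: the fraction of signature pairs that EVERY reviewer
# independently called "no_match".
def pooled_consensus_mismatch_rate(verdicts):
    """verdicts: one list per reviewer, each entry 'match' or 'no_match'."""
    n_pairs = len(verdicts[0])
    unanimous = sum(
        1 for i in range(n_pairs)
        if all(reviewer[i] == "no_match" for reviewer in verdicts)
    )
    return unanimous / n_pairs

# Toy example: three reviewers, four signature pairs.
verdicts = [
    ["no_match", "match",    "no_match", "match"],  # reviewer 1
    ["no_match", "no_match", "no_match", "match"],  # reviewer 2
    ["no_match", "match",    "match",    "match"],  # reviewer 3
]
print(pooled_consensus_mismatch_rate(verdicts))     # → 0.25 (1 of 4 unanimous)

# The study's novice figure: 63 unanimous mismatches out of 499 pairs.
print(round(63 / 499 * 100, 1))                     # → 12.6
```

Because it requires unanimity, the pooled consensus rate is always at or below each individual reviewer's rate, which is why it is the conservative choice.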
Experiment I Summary
So, in conclusion from the first experiment, what we see is that 240,902 EVBs should have been cured. Remember, the county originally had 1.3%, or 25,000. If we use the novices’ average mismatch rate, this study suggests that 384,295 should have been cured. But even if you take the more conservative number, the pooled consensus mismatch rate, that would be 240,902.
Now, let’s go to Experiment Two. So let me just jump in here. The first experiment is the one where we used novices, non-forensic document examiners. Now we’re going to look at how the experts do it. These are experts who are hired by courts to examine signatures and do forensics on them. We hired three of those, and they helped us. Okay, so let’s go here.
So this is Experiment Two, where we brought in experts. The three experts were given the same 499 pairwise signature images and had to say match or no match. Again, we calculated their individual mismatch rates and the average mismatch rate among all three, and then we also looked at their pooled consensus mismatch rate. So again, the x axis is the number of signatures; going from left to right, they’re processing them.
Results of Experiment II
And you can see that these two FDEs converge, and this FDE is a little bit off from the other two, but they’re all far higher than the novices. When you average them, you get 60.3%. Here are the numbers, and they’re pretty close: the first and the third are above 60%, and the middle one is 40-plus.
But regardless, among the forensic document examiners, the average was that 60.3% of the signatures mismatched, while Maricopa said only 1.3%. Obviously, the forensic document examiners have a much higher standard than the criteria the non-FDEs were asked to follow. Again, we then did the pooled consensus, which means we looked at the FDEs and asked: for how many signature pairs did all three conclude a match, for how many did all three conclude a no match, and where did they disagree? And this is what you see. They obviously had far more mismatches: 195 out of the 499 were called mismatches by all three. There were only 97 where they all said match, and on the rest they disagreed. Regardless, this means the pooled consensus signature mismatch rate is 39.1%.
Experiment II Summary
Again, far higher than the Maricopa signature mismatch rate. This is the pooled consensus of the three FDEs. If you use that pooled consensus number, you find that 747,560 EVBs should have been cured. And what you see here, comparing to Maricopa’s 1.31%: if you use the average rate of 60.3%, then 1,152,887 ballots should have been cured; but you can instead use the more conservative number of 39.1%, where they all agreed on common signature mismatches.
That means 747,560 should have been cured. So there you go, two very different numbers. The novices are seeing around 240,000; the experts’ minimum is around 740,000. Either is far more than the 25,000 that Maricopa cured. That’s the main takeaway from this part so far. Now we’re going to do an exercise where we bring together the novices and the experts, and see what we get there.
Combined Analysis of Novices & Experts
So, the combined analysis of novices and experts: novices meaning the non-FDEs, experts meaning the forensic document examiners. Again, this is all six. At the bottom here are the three novices, whose mismatch rates are very low; the experts are up here, much higher, as you can see. Then we took the average of all the experts, which is the saffron line at 60.3%, and the average of the novices, 20.1%.
But if you average everyone together, that’s 40.2%, which means that when you pool the novices and the experts, you get 40.2%. You can look at each individual reviewer and the averages we have here, but 40.2% is still higher than the Maricopa signature mismatch rate. My intention was to look at this problem in different ways: just the novices, their average and their pooled rate; just the experts, their average and their pooled rate; and then everyone put together. And that’s what we have here: the average of everyone together, as you can see, is 40.2%, still far higher than the 1.3%.
The next thing I wanted to do was bring all the experts and novices together and ask, for how many of the ballots’ signatures did they have agreement in common? That led to this bar chart. The x axis here is how many reviewers agreed that a given EVB’s signature was a mismatch, the common mismatches. So let me explain.
The leftmost bar means that for 97 of the ballots, none of the reviewers called the pair a mismatch, which means all six agreed those signatures matched. And just to show you, here are examples of some of those signature matches. All six people, novices and experts, agree this signature and that signature match; they look pretty similar.
Examples of Signatures Matches
And again, here’s another one: three novices and three experts agree these two match. Same with this pair, and with this fifth example here. So those are examples where the novices and experts all agreed the signatures match, and there were 97 of those. But we’re interested in the mismatch rate.
Now we want to look at the right bar here: how many times did all six reviewers agree that a ballot’s pairwise signatures did not match? That was 60 signatures. The 60 is where all three non-FDEs and all three FDEs agreed they did not match, signature mismatches agreed by all three novices and all three experts. And we have examples of this, where all the novices and experts agree they do not match: this and this do not match.
Calculating Signature Mismatch Rate
This and this do not match; all three novices and all three experts agreed, and you can see pretty clearly they don’t match. Same here: very different slant, very different writing. And same here. So those are examples from the 60 cases, out of the 499, where the novices and the experts all agree the pair doesn’t match. That’s called the pooled consensus signature mismatch rate over the common mismatches of novices and experts. So what is that? Well, if you look at this, it’s this.
Combined Analysis Summary
Again, the saffron sliver here is 12.0%: 60 of the 499, some examples of which we just shared with you, where both groups agree they don’t match. So that’s a 12% pooled signature mismatch rate, and it’s still much higher than the 1.3%. If you use the 12% number, you find that 229,430 EVBs should have been cured. So again, comparing to Maricopa, which cured 25,000 at 1.31%: if you use the pooled mismatch rate of 12%, which is the most conservative, you get 229,430.
If you use the less conservative number, the average mismatch rate among all six, you would get 768,591. So that’s what you would get with that number; it’s significantly higher when you pool everyone together. All right, so let’s go here.
All right, let’s have a discussion now. As we’re wrapping this up, the intention of this pilot study was exploratory. We’re not accusing anyone; we literally wanted to recreate how signature verification goes on. And by the way, this study had not been done before; it’s quite amazing that it hadn’t. We’re the first to do it at the Election Systems Integrity Institute, which has published the work that EchoMail did. But these are the conclusions. What we find is that of all of these early voting mail-in ballots, only 25,000, or 1.3%, were cured.
And out of all of these, only 0.031% were confirmed to be mismatches post-curing; that was only 587 signatures. Just to let you know: only 587 signatures out of that 1.91 million were finally confirmed to be mismatched, while 25,000 were initially considered mismatches before curing. So what we wanted to do next was look at different scenarios, because we presented several here. If you look at this, these are all the possible scenarios.
So the first scenario is: imagine we use the novices’ average mismatch rate, which was 20.1%. That says 384,295 should have been cured, and if you apply the 2.3% to this, 8,839 ballots should not have been allowed after curing. If you use the non-FDE novices’ pooled consensus of 12.6%, that comes down to 240,902, in which case 5,541 ballots should have been disallowed.
If you look at the experts’ average, a much higher mismatch rate of 60.3%, then 1,152,887 should have been cured, which means 26,516 would have been disallowed. If you use their pooled consensus rate, which means where all the FDEs agreed on the same common set of non-matching signatures, that would have been 747,560 cured, which would have meant 17,194 ballots should have been disallowed.
Now, the last piece is combining novices and experts. If you use the combined average rate, that would have been 40.2%; as we mentioned, 768,591 ballots should have been cured, which means 17,678 should have been disallowed. And if we use the most conservative number here, all six people agreeing that a certain set of 60 signatures did not match, that would be 12%, which would still mean 229,430 ballots should have been cured, and 5,277 ballots should not have been allowed, about 10 times more than the 587 the county actually disallowed.
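The scenario arithmetic above is straightforward to reproduce. This sketch uses the rounded rates as reported in the talk, so a projected count can differ by a ballot or so from the report's exact figures:

```python
TOTAL_EVBS = 1_911_918      # early voting ballots received and counted
POST_CURING_RATE = 0.023    # county's 2.3% confirmed-mismatch rate after curing
COUNTY_CURED = 25_000       # maximum number of ballots the county reported curing

# Mismatch-rate scenarios from the pilot study (rates rounded as reported).
scenarios = {
    "novices, average":          0.201,
    "novices, pooled consensus": 0.126,
    "experts, average":          0.603,
    "experts, pooled consensus": 0.391,
    "all six, average":          0.402,
    "all six, pooled consensus": 0.120,
}

for name, rate in scenarios.items():
    should_cure = round(TOTAL_EVBS * rate)
    disallowed = round(should_cure * POST_CURING_RATE)
    print(f"{name}: {should_cure:,} to cure, ~{disallowed:,} disallowed")

# Most conservative case: 12% should have been cured, net of what the county cured.
net = round(TOTAL_EVBS * 0.120) - COUNTY_CURED
print(f"net uncured, most conservative: {net:,}")   # → 204,430
```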
Obviously, this was a very, very close race. We’re not saying this would have affected the outcome. But the point is that the signature mismatch rates we’re discovering are 10x higher than what the county had. And then the final conclusion here. The takeaway is this: people could say, I don’t agree with this scenario, or that one, or the other.
We’re giving people literally six different choices. So if you take even the most conservative one, giving the county the benefit of the doubt, it would be 12%. At the net of it, the results show a minimum of 229,430, plus or minus our margin of error, should have been cured. Now, if we subtract the 25,000 the county did cure, you get down to 204,430, plus or minus the margin of error. That’s why we said over 200,000. All right, so that’s what we have.
Now, the final conclusions of this study. By the way, I encourage everyone to review this video. I know I’ve covered a lot here, and this video aims to be educational, but it is a detailed study that goes into detailed analysis. The reason we did this is that it’s important to recognise that the election integrity movement really needs to take a scientific systems approach. Unfortunately, there have been a lot of Grifters who jumped into this movement, and they’ve been taking advantage of people.
So, we started the institute, because those in power have created institutes at Harvard, at MIT, at Stanford. And these are really, really very smart academics who believe that there is no problem, okay? That there are no problems. So what we are sharing here is we took a very, very detailed Systems Approach. And we’re not here to hype anything, we want to invite people to have a discourse, okay. And so that’s why we did this.
And we want to encourage everyone who’s interested in that approach to study systems science, we offer a course on Systems Science, you can go to vashiva.com/join. But I encourage everyone to take a Systems Approach. We live in a world where there’s the left approach or the right approach or the pro approach or the anti approach. But ultimately, what a Systems Approach reveals is that you need to apply the scientific method.
We go into it like we did with this study. It’s a pilot study. We weren’t aware what was going to happen, we did the analysis, and we may do an extended study. So let me finish up with the conclusions. This is the first study to quantify the signature mismatching rates during signature verification of EVBs. Again, in Maricopa over 1.9 million voting ballots were received and counted. The county reported no more than 25,000 of these ballots had signature mismatches and required curing. A pilot study, as we said, recruited three novices and three experts to calculate the signature matching rates.
On the sample of 499 EVBs, all six reviewers were presented the EVB envelopes to evaluate, and what we found was that all six reviewers in this pilot study concurred that 60 of the 499, or 12%, were signature mismatches. Based on the study, over 200,000 early voting ballots should have been cured, versus the at most 25,000 that the county actually cured.
We want to be humble about this: while the study is compelling on its own, an expanded study is warranted to confirm its findings. And that’s really the scientific approach. Again, the title of the study is, “Irreconcilable Differences: Over 200,000 Mail Ballots with Mismatched Signatures Counted Without Being Reviewed (Cured) in Maricopa: First Study to Calculate Signature Matching Rates to Provide a Quantitative Framework for Assessing Signature Verification of Mail Ballots.” All right, I hope that helps people.
I hope this was educational. Someone sends another question that says, “I think the paper ballots should have been watermarked on an official page. We’ve seen all the fraud.” Yeah. So look, the real issue here is that we did the ballot image analysis, and we showed it at ESII. If we had done that way before, we could have done this audit for pennies. The Grifters did not want to give us the Ballot Images, which is really unfortunate.
I’m not sure why they did that, but the only conclusion you can reach is that they wanted to do this paper ballot exercise for 9 to 10 months, when if they had used Ballot Images, they could have discovered this stuff very quickly. All right. What we discovered here is that you have to go way upstream in the process, even as the ballots are coming in, and the signature verification has some weaknesses, as we’re showing here. I hope this is valuable. We will be putting this up shortly on vashiva.com. I wish you well. Be well. Thank you.
It’s time we move beyond the Left vs. Right, Republican vs. Democrat. It’s time YOU learn how to apply a systems approach to get the Truth Freedom Health you need and deserve. Become a Truth Freedom Health® Warrior.
Join the VASHIVA community, an integrated EDUCATIONAL, COMMUNICATIONS (independent of Big Tech), and LOCAL ACTIVISM platform to empower YOU to actualize Truth Freedom Health in your local communities by employing a SYSTEMS APPROACH.
The platform we are building for Truth Freedom Health® provides the infrastructure to take on Big Tech, Big Pharma, and Big Academia. Many of you have asked how you can help. You can contribute whatever you can. Based on your level of commitment to get educated, I have also created some wonderful educational gifts to thank you for your contribution.
To get the education you need and deserve, join Dr.SHIVA on his Foundations of Systems course. This course will provide you three pillars of knowledge with the Foundation of Systems Thinking. The three pillars include: 1) The System Dynamics of Truth Freedom Health, 2) The Power of a Bottom’s Up Movement, and 3) The Not So Obvious Establishment. In this course, you will also learn fundamental principles of all systems including your body.
Course registration includes access to his LIVE Monday training, access to the Your Body, Your System tool, four (4) eBooks including the bestselling System and Revolution, access to the Systems Health portal and communications tools – independent of Big Tech – including a forum and social media for you to build community with other Truth Freedom Health Warriors.
This course is available online for you to study at your own pace.
It’s time to Get Educated, or Be Enslaved.