In this discussion, Dr. Shiva Ayyadurai provides findings from the Election Systems Integrity Institute’s recent audit of signatures on 1,911,918 early voting mail ballots in Maricopa County’s 2020 General Election.
The original research in this video is made possible by generous contributions from supporters of the Dr.SHIVA Truth Freedom Health® movement. Please contribute so we may continue to bring you such original research, valuable education, and innovative solutions.
- Dr.SHIVA Ayyadurai, MIT PhD – Inventor of Email, Systems Scientist, engineer, educator – presents an updated Extended Study (“the Study”) which, along with the Pilot Study, is the first to calculate signature mismatch rates of EVBs for Experts – Forensic Document Examiners (FDEs) – for Trained Novices (non-FDEs), and in a Two-Step Review process using non-FDEs and FDEs.
- At minimum, 215,856 early voting mail ballots (EVBs) should have been cured in Maricopa versus the ~25,000 cured by the County in the 2020 General Election.
- One constraint of this Study is not having access to the signature files from the County.
- Given the nearly 10x difference in EVBs to be cured between this Study and the County’s actual number cured, if the County were to provide their signature files, an update to this Study can be performed.
- Maricopa County Election Dept. states it has a “rigorous signature verification process.”
- Of the 1,911,918 EVB signatures verified, the County reported only 25,000 were flagged as signature mismatches requiring review – “curing;” and after curing, the County concluded only 587 of the 25,000 (2.3%) to be “Bad Signatures.”
- This Extended Study confirms the findings of the earlier Pilot Study and concludes that the process used for signature verification in Maricopa is a flawed signature verification process.
Hello, everyone, this is Dr. Shiva Ayyadurai. I hope everyone’s doing well. Today we’re going to be doing an interesting presentation; it comes after a lot of hard work. And it’s a follow-up to a pilot scientific study we presented about two or three weeks ago.
And in that scientific study, we shared with you the initial results of our findings about the quality of the signature verification of ballots that took place in Maricopa County. I think that was on February 22, 2022. The Election Systems Integrity Institute, which we founded about a year ago, when we held a major conference, has now gone into full gear; we’re doing lots and lots of research projects.
And we publish work based, in this case, on the work of Echomail. In that pilot study, we analyzed close to 2 million envelopes; that’s where the ballots go in. And our initial study was done for the Arizona Senate, which commissioned us.
And in that study, we were asked to look at whether a signature existed or not in the box on the front of the envelope. We weren’t allowed to do the signature verification. Subsequent to that, and more recently, using those Ballot Envelope Images and signatures that we were able to acquire publicly from the deeds repository in Maricopa, we did the first study of its kind in the world. It’s interesting: in the scientific literature, no one has done a study literally measuring the mismatch rates of signatures, where you present to a trained staff member or an expert one signature of someone, taken from the envelope where people sign when they put in a ballot, versus their signature on file.
And comparing enough of those gives what’s called a signature mismatch rate. No one had ever done that; we were the first. We did a pilot study, as I was sharing with everyone a few weeks ago.
And we came up with some extraordinary and compelling results, which showed that at minimum over 200,000 ballots in Maricopa were counted without going through a process called curing. I’m going to cover that today. All right, so at that time, we did that study with a sample size of around 500, which represents, in statistical terms, 95% confidence.
Over the last several weeks, we’ve done a study that goes over 2,500 samples; in fact, over 2,700. We’ve done two analyses within what we call an extended study. And that study also confirms the pilot study.
International Center for Integrative Systems
And I want to share that with you. So I believe this is going to be educational for most of you. I hope you enjoy it. And let me first of all begin by letting everyone know that the Election Systems Integrity Institute is a part of another, you know, larger foundation I have called the International Center for Integrative Systems.
You can find it at IntegrativeSystems.org. And this center, I started close to around I think 14 years ago. And our center really is focused on educating people on the power of systems thinking. And we run many, many different programs and projects inside of it, anyone’s free to explore that. But if you go to the center’s website, you’ll see that we’ve done a whole range of projects on systems.
So one of the projects we’ve done is, for example, on Systems Biology: using the systems approach to look at the interconnection of food systems. So we did quite a bit of work on genetically engineered foods, and we discovered some significant problems when food is engineered and what it does at the plant systems level. Also, out of the systems work on food, we run the international Clean and Raw certified food program, which is also part of the International Center for Integrative Systems.
For those of you who know young people or young kids, let them know that we also have something called InnovationCorps.org. It’s an initiative that I started to recognize young innovators between the ages of 14 and 18. And if you go to the website, we still have grants available, but it ends on March 31st of this year; young people, 14 through 18, can apply. Our center funds those kids with a $1,000 grant, and they get four mentoring sessions. We really want to recognize young people.
Election Systems Integrity Institute
Another part of the International Center for Integrative Systems is the Election Systems Integrity Institute, which focuses on publications and research on election systems. And that’s what we’re going to talk about today.
And for those of you joining, we’re going to be discussing one of the research projects we just finished, on understanding signature mismatch rates. This is an effort out of the Election Systems Integrity Institute. I’m going to go right into it so we don’t waste any more time. Here’s the cover of the program; let me get it started up.
This is the cover of the report that we submitted. We recently updated it; in fact, this morning we submitted it, and I want to thank our team for it. The version I’m showing you has certain information that was only viewable to the Attorney General of Arizona. But what this extended study confirms, at minimum, is that over 200,000 mail ballots with mismatched signatures were counted in Maricopa County, Arizona without review, a process called curing, and you’re going to learn about that.
And I just want to let everyone know, this process of curing occurs in 22 states in the United States. So think about Maricopa as a case study of potentially what’s going on elsewhere. So let’s just jump right into it. What you see here is the entire study; this will be published on our website.
But the key elements: it’s a long study, over 114 pages, which we put together, but let me just go to the executive summary. What we discovered was that at minimum 215,856 early voting mail ballots, which I’m going to call EVBs, should have been cured in Maricopa, versus the 25,000 that were cured by the county in the 2020 General Election.
This updated extended study, along with the pilot study that we did several weeks ago, is the first, to the best of our knowledge, to calculate signature mismatch rates of EVBs: for experts, who we call forensic document examiners (FDEs); for trained novices, who we call non-FDEs; and in a two-step review process using both non-FDEs and FDEs.
Now, one constraint of the study I want to point out to everyone is not having access to the signature files from the county, so we had to go mine and get our own signatures that were publicly available. Now, given the nearly 10x difference in EVBs to be cured between this study and the county’s actual number cured (we found 215,856, the county had 25,000), if the county were to provide their signature files, an update to the study could be performed.
Let me provide some key elements of the abstract here. One of the things I want to point out is that you can see we calculated different mismatch rates based on different conditions and different experiments, ranging from as low as 11% all the way to a high of 48.9%.
And you can see the number of ballots the study predicts should be cured. In this case we’re only taking the minimum amounts; I just want to let everyone know we’re being very, very conservative. We’re saying that at minimum over 200,000 ballots should have been cured, but the number could be higher.
So I want to just let everyone know we’re being very conservative in this estimate. Alright. So, just a review of what occurred in the past: we did an initial pilot study. In that pilot study, we used 499 EVB signature images, and by the way, that represents a randomly selected sample from the 1.9 million at 95% confidence, such that the real value would be within a plus or minus 4% margin of error.
Now, in that pilot study, we had six reviewers: three experts, called Forensic Document Examiners, and three trained novices, who were presented pairwise images of signatures from the envelopes alongside a genuine signature. And they all concurred on about 12% of the EVBs, again a minimum, as signature mismatches. The pilot study concluded that over 229,000 EVBs should have been cured, versus the upwards of 25,000 that the county cured.
Okay. And though the results from that pilot study were very, very compelling, we decided that it would be important to do an extended study, right? In science, sometimes they’ll do a little biopsy, or in engineering you’ll do a small test, just to get an idea: hey, is something going on there? If it is, then you go do a larger study. So we did the initial study, and we found what we thought were some compelling results.
And now we did this extended study. So that’s what you’re getting today. Okay. So let me go back here. Now, this study used a sample of 2,770, five times larger than the pilot study, which gives us a 99% confidence level with a plus or minus 2.5% margin of error. So it’s a much more highly resolved, more accurate study. This study also used a revised sample of 2,379.
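As a quick sketch, the sample sizes just described are consistent with the standard formula for estimating a proportion at the worst-case p = 0.5. The z value of 2.576 for 99% confidence is a standard table value, not something from the study itself:

```python
import math

def required_sample_size(z: float, margin: float, p: float = 0.5) -> int:
    """Minimum sample size for estimating a proportion with the given
    z-score (confidence level) and margin of error; p = 0.5 is worst case."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

def margin_of_error(z: float, n: int, p: float = 0.5) -> float:
    """Margin of error achieved by a sample of size n at the given z-score."""
    return z * math.sqrt(p * (1 - p) / n)

# 99% confidence (z ~ 2.576), plus or minus 2.5% margin of error:
n_99 = required_sample_size(2.576, 0.025)  # about 2,655; the study used 2,770
moe = margin_of_error(2.576, 2770)         # under 0.025, so inside the target
```

So a sample of 2,770 comfortably covers the 99% / ±2.5% target the study states.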
First, we used all 2,770. And then we did a revised study to even be more conservative. Let me explain what I mean by this.
So we took 2,770 images out of the 1.9 million. So that was our sample at the 99% confidence level.
And we put them here; then we had to go get the genuine signatures for those samples. Where did we find them? Well, the county did not give us signature files. So we went to the recorder’s database, which is a publicly available database in Maricopa where you can find people’s deed signatures. And we mined those.
So we have two signatures side by side. Well, someone could say our study could be inaccurate: was the signature that we got from the county records really the right one? So we did two sets of studies, and I’ll show you what we did here. We first did 2,770.
And that’s called Analysis A. But we also did another analysis, because what we discovered when we did the first analysis was that there were about 290 signatures where everyone said, hey, these are definitively bad signatures. Everyone agreed these are mismatches, to be specific.
So we said, okay, if these are mismatches, that’s pretty amazing. Or it could be that we got the wrong signature. So guess what we did: instead of keeping them to improve our accuracy and make people more confident, we said we’re going to eliminate them.
Okay, and still see how it comes out. That was the second analysis we did. And we also took out some others, being even more paranoid. Out of the 2,770, we removed around 391, and that left 2,379. We did a second analysis, and I’ll share that with you. So we really did two analyses within this study. So let’s go right into that.
What we found was, let me just go right into the summary: we found that if experts, forensic document examiners (that’s what they’re called, FDEs), alone were used to review the EVBs, then at minimum 786,753 EVBs should have been cured, or at a maximum 936,457 EVBs. That’s if experts reviewed them. Remember, experts apply a lot more scrutiny. These counties that do signature verification have trained staff, but the training they get is a few hours; it’s not like they’re forensic document examiners.
We also looked at how the trained novices would do on our end, right, the non-FDEs. In their case, they would have found that at minimum 344,528 EVBs should have been cured, or at a maximum 544,897. Now, the next thing we did was emulate what occurs in Maricopa, where a ballot first goes to the trained staff volunteers, people who are not FDEs, who may find a signature mismatch.
It then goes to what is called a manager, someone with more expertise. And if they also say it’s a mismatch, then it is sent to curing. Curing is where they call people (and I’ll go through this more) to say, hey, maybe the person had Parkinson’s or some difficulty, and that’s why their signature is off. So it’s really a two-step initial review process.
We simulated that by doing this two step process. So we had really three signature mismatch rates we calculated for both analyses, okay. And so let me go through that.
What the study revealed, at minimum, is that 215,856 early voting mail ballots should have been cured. So this confirms the pilot study. And what we really want to bring out here is that if the county were to provide us the signature files they use, so we didn’t have to go do all this hard work of mining signatures from public records, then we could update this study, and also use machine algorithms to do a full analysis.
So let’s begin with the background. By the way, this is Dr. Shiva Ayyadurai, for those of you who just joined us. This is a review where I’m sharing with you the results of our extended study, following the pilot study, calculating signature mismatch rates and essentially performing the first study of its kind, in an extended way, on signature verification. That’s what you’re joining.
So welcome, everyone. The background to this: let me give you a little bit of education, because some of this stuff may be new. So let’s understand, what is signature verification? Well, first of all, it’s multi-step.
What is Signature Verification?
It’s really a systems process, aimed at verifying a signature based on a review of two signatures side by side: one being the genuine signature, the other being a questioned signature. And in elections, here is what happens with the envelopes.
People put their ballot in an envelope, the early voting mail ballot envelope. And it’s sent to a facility, typically in the election office, where it’s scanned; the envelopes are scanned to create an envelope image. Then an initial review is performed, before those envelopes are even opened, to determine if the person who signed is in fact the person they say they are.
Okay, so how is this done? Well, side by side, human beings review the ballot envelope image, which has a signature, against a genuine signature on file. Now in Maricopa County, they have signatures on file, perhaps a person’s voter registration signature, or their DMV (motor vehicles) signature.
And human beings reviewed all nearly 2 million or over 1.9 million envelopes. Okay, and they did this review.
So first, the trained staff reviewed it, right. If it was found to be a match, we’ll talk about what happens there. If it was found to be a non-match, one of the things they did in this case, as you can notice in the sub-bullet here, is send it to a manager with more expertise to determine if it should be cured, okay. So if it’s a match, the envelope is opened and the ballot is processed; if it is not a match, then it goes through curing. And in curing, if it’s found to be a match, then the ballot is sent to tabulation, as it would have been in the first step.
And if it’s confirmed not to be a match, then it’s denoted as a bad signature. Okay, so for all of you joining us, this is the signature verification process. Maybe there are variations where you’d have bipartisan people, etc. But this is really the process. Okay. It’s a multi-step process.
Results of Signature Verification In Maricopa County
All right. In Maricopa here were the results from the 2020 election. Again, this is the extended study we’re sharing with you.
This is not the pilot, a version of which you’ve seen before; this is a much more intensive study that we did. Okay, so we see here that out of all 1.9-million-plus ballots, 25,000 were sent to be cured.
And that’s about 1.31%. Now, out of the ones that were sent to curing, 587 were finally decided to be bad signatures, which is three one-hundredths of one percent of all 1.9 million EVBs, and about 2.3% of the cured. Okay.
So what did we do? Okay, what we did was, first of all, step one: we selected a representative sample, and we wanted the extended study to have a very high confidence level. Okay, so we selected a sample to have a confidence level of 99%, such that the margin of error would be only plus or minus two and a half percent. And to achieve this, we needed 2,770 samples, five times more than the pilot study.
That’s what we got. So first we organized the data set of the envelope signatures with their images; we had 2,770 of them. And then we had to create a dataset of the genuine signatures.
The county in Maricopa did not give us the genuine signatures. So what we did was we went to the Maricopa recorder’s deeds repository. And so if your name was John Smith, we found John Smith’s deed.
But remember, there could be many John Smiths. How do you know John Smith is John Smith? Well, it turns out, if you use a middle initial, it really hones it in. So if the middle initial, first name, and last name were all there, then we accepted it. But if we couldn’t find that middle initial, then we also looked for the address.
And if we couldn’t find an address match, then it was thrown out. Okay. So it’s a multi-step process, using both humans and technology.
Now, if the county were to give us their files, we could update the study. So I just want to let everyone know, our constraint here was that we had to go and mine genuine signatures. However, as the slide says, many forensic document examiners say signatures on a deed are often more reliable, because you have to have a notary for that, versus signatures when you sign a voter registration file. All right. So that was that.
So the next thing we did was, we had two sets of people, as I mentioned: forensic document examiners and non-forensic people, as we talked about here, okay. And they were presented with an image on the left, which was the signature on the envelope, and the genuine signature that we mined, and they had to make one of two choices: either it’s a match, or it’s not a match. Okay.
All right. So then, in step five, to put it simply: we not only did that for the 2,770 and got results, which is called Analysis A, but we also wanted to make sure, because some people may critique us and say your genuine signatures may not really be genuine. So we did an interesting thing.
We looked at every case where all six people concurred it was a no-match. We could tell: wow, six people concurred it’s a no-match. That was in the first analysis.
And in the second, we said: let’s throw them all away. Let’s assume that when everyone says it’s a no-match, it means the signature that we got is absolutely wrong. So we did that.
So we threw away 290, okay, close to 10%. And then, being a little more deliberate, we also looked for a better middle-initial match, and we got rid of another 101.
In the second analysis, we took the 2,770 and threw away 391 to get a sample with potentially fewer erroneous signatures from the deeds. Okay, so we really, really wanted to be as conservative as we could. So that’s what I wanted to say; again, if the county had given us the signatures that they have, we wouldn’t have had to do a lot of this hard work. Okay. So let’s jump right into it.
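The bookkeeping for the reduced sample just described can be sketched in a few lines. The numbers come from the study text; the variable names are mine:

```python
# Numbers from the study text
full_sample = 2770
unanimous_no_match = 290      # all six reviewers called these mismatches; removed
middle_initial_rejects = 101  # removed after stricter middle-initial/address checks

removed = unanimous_no_match + middle_initial_rejects  # total pairs discarded
analysis_b_sample = full_sample - removed              # sample used in Analysis B
```

This reproduces the 391 removed pairs and the 2,379 samples used in Analysis B.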
Analysis A: Initial Set of 2,770 Samples
Okay, so the first analysis we did was on the sample of 2,770. All right. And we did the first experiment within Analysis A, where the goal was to determine the signature mismatch rates using experts, the forensic document examiners.
So we got three forensic document examiners, and they were presented the pairwise images, right, of the 2,770. And then we calculated what’s known as a pooled consensus mismatch rate. Okay.
We’ll talk about what that is. It means: out of all the times the three FDEs saw the same pair of signatures associated with an EVB, how often did they conclude match or no-match? Okay. And we did that at each level.
It’s almost like three people voting, okay. And we aggregated all that. For each pairwise signature the forensic document examiners were presented, we determined the distribution of probabilities, and then the mean of those probabilities across the 2,770 determines the FDE pooled consensus rate. So this was the data for each expert; just look at that for a little while.
And you’ll see each expert had a range, from 23% of the EVBs being mismatches all the way up to 71%. And that’s for each FDE; the blue is their match rate, and the red is their mismatch rate. Okay? All right, so they have widely varying mismatch rates.
Then, if you want to see how their mismatch rates varied over time as they processed those 2,770, you can also see that. All right, then what we did was figure out the pooled consensus. So for every ballot, okay, from one to 2,770, we calculate that; that’s what each one of these lines is.
The votes essentially, if one person of the three said it was a mismatch, that’s one out of three, right? 33%. If all three said it, it’s 100%. That’s why the scale here goes from zero to 100%.
All right, so we literally have this histogram. These are literally the probabilities of that. And then we calculated what’s called the pooled consensus signature mismatch metric, which I’m going to call Beta.
And we found that to be 48.98%. That means, among all the FDEs, the experts, 48.98% of the time, on average, they would say, hey, this early voting ballot has a mismatch. But remember, experts are seeing things that non-experts do not see.
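To make the pooled consensus idea concrete, here is a minimal sketch: the metric is the mean, over all ballots, of the fraction of reviewers who called each ballot a mismatch. The vote data below is hypothetical; only the averaging matches what is described above:

```python
def pooled_consensus_mismatch_rate(votes: list[list[int]]) -> float:
    """votes[i][j] is 1 if reviewer j called ballot i a mismatch, else 0.
    Returns the mean per-ballot mismatch fraction across all ballots."""
    per_ballot = [sum(v) / len(v) for v in votes]
    return sum(per_ballot) / len(per_ballot)

# Toy example: 4 ballots, 3 reviewers (hypothetical data)
votes = [
    [1, 1, 1],  # all three say mismatch -> 100%
    [1, 0, 0],  # one of three -> 33%
    [0, 0, 0],  # none -> 0%
    [1, 1, 0],  # two of three -> 67%
]
rate = pooled_consensus_mismatch_rate(votes)  # (1 + 1/3 + 0 + 2/3) / 4 = 0.5
```

In the study, the same averaging over the 2,770 real ballots yields the Beta of 48.98% for the FDEs.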
That’s the first thing we did. And we call that variable Beta. Next thing we did was we said, okay, let’s just group how they, you know, voted.
This goes from ballots where they said it’s a mismatch 100% of the time down to 0%, and you can see the distribution of ballots, okay. Just a nice graph here.
At the end of the day, if you used the FDEs to cure, they would have cured 48.98%, or 936,457 ballots, okay. Experts.
Then we said, Let’s do it with trained novices, non experts, non FDE’s. And in this case, the county has a guide, and that guide was followed. Okay.
And again, we presented them with the same set, and they also did the pooled consensus, same process. And here are those results. So you see ranges from 23.1% to 31.2%. These trained novices were very, very close to each other.
You can see their results here. Okay, but bottom line, it’s in this range. And then we did the same thing for calculating the pool consensus.
Again, every line here is literally a vote, the actual result of how each non-FDE, or trained novice, voted on each ballot. And that helps us calculate a different signature mismatch rate for the novices, which we call Alpha.
And that turned out to be 28.5%. Okay, and again, you can see a distribution of their votes across the various ballots.
This means that for 1,481 ballots, none of them said it was a mismatch, which means those are all matches, all the way down to 363 ballots where all three said they were mismatches. Okay. And then everything in between. All right, then what we did was calculate the ballot numbers.
So the non-FDEs, the trained novices, if they alone reviewed, would have said that close to half a million, 544,897, should have been sent to curing. Right. Then we did something more interesting.
Determining Two-Step Review Signature Mismatch Rate
Remember what really happened? The case we’re looking at is this: if just the FDEs did it, over 900,000 would have been cured; if just the novices did it, over 500,000. But in the election process of curing and signature verification, apparently in Maricopa, it’s a two-step initial review process. First, the trained staff, in this case a trained novice, reviews it, and if they say, hey, this is a mismatch, it goes to an expert, the manager, and then the expert reviews it again. So it’s really a joint function, a joint probability.
Okay, so we did that in the extended study in a little more sophisticated way than we did in the pilot. Okay. So here we go.
So in Maricopa, again, the initial review involved trained staff. And that’s what I’m describing here that we did. And this is another way to view it.
Just so people don’t get scared with the math here. It’s actually pretty simple, but just to keep it straightforward. E represents all the ballots.
It’s really a unit vector of all the 2,770 ballots that came in, and then these are reviewed by the trained staff, who have a signature mismatch rate of Alpha. So what comes out of this, what would be sent to the FDEs, is E multiplied by Alpha.
Sometimes we will call it a dot product, because these are actually two vectors, okay? These ballots are now no-matches and need to be reviewed by a manager. The manager has their own signature mismatch rate, which we’re calling Beta, and which we calculated. At the end, the number of ballots you’d get is E times Gamma, where Gamma is the combined mismatch rate, a joint probability to be specific, across both the trained novice and the FDE. Okay, so we calculated that. It’s quite a bit of work, but I’ll walk you through it.
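A minimal sketch of the two-step flow just described: only ballots flagged no-match by the trained novice AND then by the FDE manager reach curing, so Gamma is measured on the jointly flagged ballots rather than computed as Alpha times Beta (the two reviewers' judgments are correlated). The flag data below is hypothetical:

```python
def two_step_mismatch_rate(novice_flags: list[int], fde_flags: list[int]) -> float:
    """Fraction of ballots flagged no-match by the novice in step one
    AND by the FDE manager in step two (the joint rate, Gamma)."""
    assert len(novice_flags) == len(fde_flags)
    joint = sum(1 for n, f in zip(novice_flags, fde_flags) if n and f)
    return joint / len(novice_flags)

# Toy data: 8 ballots, 1 = flagged no-match (hypothetical)
novice = [1, 1, 0, 1, 0, 0, 1, 0]
fde = [1, 0, 1, 1, 0, 0, 1, 0]
gamma = two_step_mismatch_rate(novice, fde)  # 3 of 8 flagged by both = 0.375
```

Note that this joint Gamma can differ substantially from the product of the two individual rates, which is why the study measures it directly.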
Calculation of EVBs Determined by non-FDEs to be No Match
So what we found was that if you calculate the ballots in this first case, coming out right here, E times Alpha, it turns out 790 early voting ballots would go to the managers for review out of the 2,770. Okay, that’s right about there. Then, in order to calculate Gamma, we need the joint signature review mismatch, right? So we plotted that, and then we calculated Gamma, which turns out to be 22.27%. Okay, 22.27.
And if you go to that 22.27, if you notice here, we get various possibilities, because now two people are voting on this.
All right. And you see here that 617 ballots would have gone to curing, because this is how many came through both people reviewing them. And if you work that out, that’s 22.27%, which means 425,784 ballots. So what this means is that if we followed the two-step process in Maricopa, using both sets of people, over 400,000 early voting ballots should have been cured. Okay.
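The arithmetic behind that 425,784 figure can be reproduced directly. The inputs (617 joint flags out of 2,770 samples; 1,911,918 total EVBs) are the study's own numbers; rounding Gamma to four decimal places appears to match how the 22.27% figure was applied:

```python
TOTAL_EVBS = 1_911_918   # early voting mail ballots in Maricopa, 2020

flagged_in_sample = 617  # ballots flagged no-match by both reviewer types
sample_size = 2_770

gamma = flagged_in_sample / sample_size          # about 0.2227, i.e. 22.27%
projected = round(TOTAL_EVBS * round(gamma, 4))  # scale sample rate to all EVBs
# projected comes to 425,784 ballots sent to curing under the two-step process
```

The same scaling applied to the FDE-only Beta of 48.98% reproduces the 936,457 figure quoted earlier.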
Now, obviously, people would say, hey, well, this seems high. And the reason your study is flawed is because you didn’t use the same signatures that the county did. Remember, we went to use the signatures that we got from the deeds repository.
So let’s again, if the county gave us their signatures, this would not be an issue we would just use that. Okay.
We sort of stepped back and said, okay, let’s be even more strict. Maybe we should throw away some of those 2,770 pairwise comparisons: any where we suspect the genuine signature could be bad, let’s throw them away and redo the analysis. So that was really Analysis B.
Analysis A Summary
Again, we were being conservative here. And by the way, just to summarize Analysis A: these are the different signature mismatch rates, and these are the different numbers of ballots that would have been cured. Okay.
That’s what the summary showed that I just walked you through. What we did now, pursuant to what I just said, was apply additional constraints. We were really being sort of paranoid: hey, let’s remove any of those genuine signatures we got that could potentially be bad.
Updated Analysis with Additional Constraints
So how did we do that? We did two very interesting things here. We removed another 101, where we applied further scrutiny to whether the middle initials matched. And if the middle initials didn’t match, we again checked the addresses.
So we found about 101 there, again being conservative. But then we said, let’s look at all the ones from the previous analysis that all three FDEs said were not matching. That’s 582. And let’s look at all the ones that the trained novices said were not matching.
That’s 363. And then we asked: on how many of them do both groups agree, all six, that they’re not matching? Okay. Now, what could that mean? Experts and novices are both saying that 290 pairs definitively do not match. So we invoked what’s called the wisdom of the crowds.
In statistical terms, we said: why don’t we treat all of those as bad, right? There are two possibilities: the genuine signatures we got are bad, or they’re indeed mismatches. But we took the conservative case, and we said, let’s just toss them all out.
Okay. So again, giving the benefit of the doubt to our critics who may say, Hey, you got the wrong genuine signatures. So we did that.
Example of False Negative
So we took out all those 290, okay, from here. So, at the end of the day, we removed 391 pairwise signatures, and we ended up with a new sample of 2,379. Okay, now what’s interesting is that among those 290 we removed, just to let everyone know, there were some that were very clearly genuine signatures, because we had the address match and the middle initial within that 290 pool. But we threw them all away, just to let you know how cautious we were being; we even threw away definitively genuine signatures, okay? But that’s fine.
Because we want to, you know, lower our probability of error. So now, I can’t show you these because they are constrained by their signatures here. So let’s show you the summary of the updated analysis.
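The removal arithmetic described above can be sketched as a set intersection. The ID sets below are hypothetical stand-ins, since the actual pairwise-signature data is not public, and the original sample size of 2,770 is inferred from 2,379 + 391:

```python
# Hypothetical pairwise-signature IDs standing in for the real (non-public) data.
fde_flagged = set(range(582))                              # flagged by all 3 FDEs
novice_flagged = set(range(290)) | set(range(1000, 1073))  # flagged by all 3 novices (363 total)

# "Wisdom of the crowds": pairs that both groups agree do not match.
both_flagged = fde_flagged & novice_flagged
assert len(both_flagged) == 290

middle_initial_removals = 101                  # pairs failing the extra middle-initial scrutiny
removed = middle_initial_removals + len(both_flagged)  # 391 pairs removed in total

original_sample = 2770                         # inferred: 2,379 + 391
new_sample = original_sample - removed
print(new_sample)  # 2379
```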
Analysis B: Reduce Set of 2,379 Samples
This is Analysis B. So in Analysis B, what I’m sharing with you is: we did Analysis A and came up with the range of potential ballots; now we’re doing a more conservative estimate, where we’re throwing away anything that could potentially not be a genuine signature.
What do we have here? We’re looking at 2,379 samples. Again, we ran Experiment One, as I’ve talked about; we did the pooled consensus analysis. And this is what we find in the new Analysis B: the ranges go from 12.4% to 66%, so they lowered. Those are the mismatch rates by FDEs, forensic document examiners; still high, but lower than Analysis A. And this is the distribution of how they voted on each individual EVB.
And you can see, in this case, Beta is 41.15%; in the earlier case it was close to 49%, so it has dropped by about 8 points. But it is still 41.15%. These are, again, the group consensus probabilities, and what we find in Analysis B is that if the experts had reviewed these EVBs, 786,753 would have been sent for curing. Then we did Experiment Two with the novices.
Alright, just like before in Analysis A, we did this for Analysis B. And here, as everyone knows, we did the same process, the same pooled consensus as I’ve talked about, and this is what we find: among the non-forensic document examiners, the rate goes from 12.7% to 21.4%, again lower than the original Analysis A. And again, we used the distribution to calculate the pooled consensus, which we found to be 18.02%.
What does that mean? Just like before, it means that if you gave a signature to a trained novice, 18% of the time they would say, hey, that is a no match, and they would send it to their manager. Just as, if you gave one to an expert, 41% of the time they would say that ballot is a no match. So, experts: 41.15%. And trained novices: 18.02%.
Alright, so now we want to put it all together. That’s 18.02%, and by the way, that would mean 344,528 ballots would have gone to curing if you used the novices’ rate. But now we said, let’s apply the two-step process. In the two-step process, as we talked about, Maricopa has trained staff do the first review, and then they send flagged signatures to the manager.
Determining Two-Step Review Signature Mismatch Rate
So we did the same thing. But notice this time there are 2,379 pairwise signatures. First, we calculate how many would have gone to the manager, which is E times Alpha, and that comes to 429 EVBs, fewer than before, which is what we would expect.
And then we want to calculate the two-step mismatch rate across both the novices and the FDEs, and we find that to be 11.29%.
That means 269 ballots in the sample would, at the end of the day, have been sent to curing. So we put it all together: 11.29%. Again, this is a very, very conservative number, because we’ve now eliminated any pairs we suspected were not genuine signatures and redone the analysis.
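The two-step arithmetic above can be sketched as follows. The variable names are mine, not the Study’s; E is the sample size and Alpha the first-step (novice) mismatch rate:

```python
E = 2379                 # pairwise signatures in Analysis B
alpha = 0.1802           # trained-novice (first-step) mismatch rate
two_step_rate = 0.1129   # combined novice + FDE (two-step) mismatch rate

# Step 1: novices escalate suspected mismatches to the manager.
sent_to_manager = round(E * alpha)
# Step 2: only pairs the manager (FDE) also flags go on to curing.
sent_to_curing = round(E * two_step_rate)

print(sent_to_manager, sent_to_curing)  # 429 269
```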
What do we find? We find, in the most conservative case of 11.29%, that 215,856 ballots should have been cured. All right. So let’s compare that now. When you line up this second, more conservative analysis, the rates range from 11% to 41%.
But we only use 11.29% to put forward that, at minimum, 215,856 ballots should have been cured; that is far more than what the county cured, which is about 25,000. So, discussion-wise, the county cured 25,000, which is 1.31%, and what we’re saying here is that 11.29% should have been cured, which is nearly 10 times more.
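Extrapolating the sample rate to the full ballot population works out as below; the ballot counts are the figures quoted in the Study:

```python
total_evbs = 1_911_918   # early voting mail ballots, Maricopa 2020 General Election
two_step_rate = 0.1129   # Study's most conservative (two-step) mismatch rate
county_cured = 25_000    # ballots the county actually flagged for curing

study_cured = round(total_evbs * two_step_rate)   # ballots the Study says needed curing
county_rate = county_cured / total_evbs           # county's effective curing rate

print(study_cured)                    # 215856
print(round(county_rate * 100, 2))    # 1.31 (percent)
print(round(study_cured / county_cured, 1))  # ratio the Study describes as nearly 10x
```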
Let’s see this. So the discussion is the following. The Extended Study you’re seeing here yields a minimum signature mismatch rate of 11.29%, against the county’s post-curing bad-signature rate of 2.3%. Applying that 2.3% to the 215,856 ballots that should have been cured means that, at minimum, 4,965 EVBs should have been thrown out, versus the 587 the county rejected. That’s important in such a close race.
All right. And the conclusion we want to put forward is this: the Maricopa County Election Department states that it has a rigorous signature verification process. If you read their material, they say they have a rigorous signature verification process, yet they only flagged 25,000 ballots for curing. Our Extended Study confirms the Pilot Study and puts forward that they actually have a flawed signature verification process.
That’s our conclusion here. And we want to make it clear: the Extended Study found that if FDEs alone had done the review, at minimum about 786,000 ballots would have been sent to curing. If the non-FDEs alone had done it, at minimum about 344,000.
And if both did it, through the two-step process, it would be about 215,000. Now, I want to state again that one constraint of the Study is not having the signature files from the county. Given the nearly 10x difference in EVBs to be cured between this Study and the county’s actual number cured,
if the county were to provide their signature files, an update to the Study could be performed. So we’re more than ready to do an update. There are also a number of things we propose for future research, which I’m not going to get into, but essentially we could continue this Study in a number of ways.
So anyway, I hope this was helpful. I know it’s a lot of detail, but sometimes there’s no other choice: you have to bear with it and dig deep. And that’s what we’ve done. In the election integrity movement right now, in my view, there are three different things going on.
One set of people deny there are any problems. And there are major institutes, at Harvard, at MIT, and at Stanford, that are all about denying there are any issues; they write it all off, academics writing all these papers. That’s on one hand.
On the other hand, you have what I call the Grifters. The Grifters spend all of their time, frankly, talking about nonsense: nothing-burgers. And it’s unfortunate, because there are real systems issues.
There are real systems issues, which is what our institute was created for: to really go after those systems issues from a scientific perspective. ElectionSystemsIntegrity.org was founded on that, and we have some incredible people working with us. Our goal is really to challenge the existing academic establishment, which believes “nothing to see here, move along.” That’s what’s happening: on one hand, one set of the academic establishment wants to hold on and capture this. Let me bring back the Election Systems Integrity Institute site.
The bottom line is that you have these two groups: the Grifters, who are essentially putting out garbage, and in doing so they’re helping the deniers. But when you take a Systems Approach, a true Systems Approach as we’ve done here, you look at the entire system flow and you go down step by step.
Now, the thing about a Systems Approach, when you do real work, is that it takes a lot more effort. I think over the last four weeks, my colleagues and I have gotten about three hours of sleep a day doing the analysis. When you do the real work, you don’t have time for grifting, unfortunately.
Right. So this is why a scientific study like this is really important: we’ve gone down and looked at things in various ways. We question ourselves, we critique ourselves, we want criticism. And what we’ve concluded in this Study is confirmed by the Pilot Study, where we dipped our toe in the water.
And now we did five times more samples, with 99% confidence and a 2.5% to 2.7% margin of error. And we’ve concluded that, at minimum, over 200,000 early voting ballots should have been sent to curing.
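The quoted margin of error is consistent with the standard worst-case formula for a sample proportion at 99% confidence (z ≈ 2.576, p = 0.5). The sample sizes below are the pair counts reported in the Study; treating them this way is my assumption, not a stated method:

```python
import math

def margin_of_error(n, z=2.576, p=0.5):
    """Worst-case margin of error for a sample proportion at 99% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# Analysis A used about 2,770 pairs; Analysis B used 2,379 (counts from the Study).
for n in (2770, 2379):
    print(n, round(margin_of_error(n) * 100, 2))  # roughly 2.45% and 2.64%
```

These land within the "2.5% to 2.7%" range quoted above, which suggests a worst-case proportion calculation of this kind.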
The other point is, we are extremely open to working with Maricopa County, if they give us signature files, maybe our genuine signatures are all crap, fine. We’ll run it through. And we’ll see what results we get. That’s how science works. Anyway, I hope this was valuable for people. Let me see if there’s any questions people want me to answer.
I hope this is valuable. This is Dr. Shiva; please go to VAShiva.com if you want to find out more about what I do. And by the way, there’s one item I want to mention: if you know young people between the ages of 14 and 18 who are interested in applying for the Innovation Corps award, we want to recognize young innovators. We award them a $1,000 check.
We give them mentoring, four lessons of mentoring. We recognize them, but it’s really to recognize that great innovations can occur anytime, anyplace, by anybody. All right, everyone.
Thank you, be well, have a good night. And if you’re having dinner, I hope you have a good one, because that’s where I’m headed right now.
It’s time we move beyond the Left vs. Right, Republican vs. Democrat. It’s time YOU learn how to apply a systems approach to get the Truth Freedom Health you need and deserve. Become a Truth Freedom Health® Warrior.
Join the VASHIVA community – an integrated EDUCATIONAL, COMMUNICATIONS (independent of Big Tech), and LOCAL ACTIVISM platform to empower YOU to actualize Truth Freedom Health in your local communities by employing a SYSTEMS APPROACH.
The platform we are building for Truth Freedom Health® provides the infrastructure to take on Big Tech, Big Pharma, and Big Academia. Many of you have asked how you can help. You can contribute whatever you can. Based on your level of commitment to get educated, I have also created some wonderful educational gifts to thank you for your contribution.
To get the education you need and deserve, join Dr.SHIVA on his Foundations of Systems course. This course will provide you three pillars of knowledge with the Foundation of Systems Thinking. The three pillars include: 1) The System Dynamics of Truth Freedom Health, 2) The Power of a Bottom’s Up Movement, and 3) The Not So Obvious Establishment. In this course, you will also learn fundamental principles of all systems including your body.
Course registration includes access to his LIVE Monday training, access to the Your Body, Your System tool, four (4) eBooks including the bestselling System and Revolution, access to the Systems Health portal and communications tools – independent of Big Tech – including a forum and social media for you to build community with other Truth Freedom Health Warriors.
This course is available online for you to study at your own pace.
It’s time to Get Educated, or Be Enslaved.