September 24th, 2021. – Dr.SHIVA Ayyadurai (MIT PhD) briefs Arizona Senate Officials on his final report. (VIDEO).
Dr. Shiva’s Final Report to the Arizona State Senate, submitted on September 24th, 2021. – Pattern Recognition Classification of Early Voting Ballot (EVB) Return Envelope Images for Signature Presence Detection.
PowerPoint Presentation Made by Dr.SHIVA Ayyadurai (MIT PhD) for Arizona Senate Public Hearing on September 24th, 2021. – Pattern Recognition Classification of Early Voting Ballot (EVB) Return Envelope Images for Signature Presence Detection.
October 12th, 2021 – EchoMail’s Replies to Maricopa County Recorder Office’s Responses from October 6, 2021.
October 6th. – 67 Page Report of Maricopa County Officials Using Local Blog To Malign Audit report. – How Election Officials Use Media Proxies to Disseminate Misinformation & Disinformation to Avert Investigations of Election Malfeasance. – A Case Study of Maricopa County Election Officials Effectively Using a Local Blog Purporting “Independent Journalism” to Malign An Auditor Who Reported Anomalies in 2020 U.S. Election.
Letter to Maricopa Officials Upon Discovering EVB Return Envelope Images Are Compressed. – Response to County Explanation of Stamp “Behind” Envelope Triangle
- It is unknown, per the CANVASS report, how many EVB return envelopes were originally received by Maricopa election officials. EchoMail received a data set of 1,929,240 EVB return envelope images, represented to EchoMail as the set of all EVB return envelopes originally received by Maricopa; however, the CANVASS report does not document how many were originally received.
- EchoMail identified 34,448 EVB return envelope images being 2-copy, 3-copy and 4-copy Duplicates originating from 17,126 unique voters, while no Duplicates were reported in Maricopa’s CANVASS report.
- 6,545 more unique EVB return envelopes were processed by Maricopa than identified by EchoMail. 464 more “No Signature” EVB return envelopes were reported by EchoMail: EchoMail identified 1,919 EVB return envelope images with Blank or Likely Blank in the Signature Region, i.e. “No Signature,” while Maricopa reported 1,455 “No Signature” EVB return envelopes.
- 2,580 Scribbles were identified by EchoMail in the Signature Region of EVB return envelope images. A “Scribble” is when the Signature Region on an EVB return envelope image contains a non-white pixel density between 0.1% and 1%.
- Maricopa reported 587 “Bad Signatures,” which is 0.031% of the total EVB return envelopes received by Maricopa. Though EchoMail was not commissioned to perform Signature Verification, if EchoMail’s 2,580 identified Scribbles were all designated as “Bad Signatures,” that would be 0.134% of Maricopa’s total EVB return envelopes received. This percentage is more than four times the “Bad Signatures” percentage reported by Maricopa.
- While the number of EVB return envelopes in Maricopa increased from 1,257,179 in the 2016 general election to 1,918,463 in the 2020 general election, a 52.6% increase (661,284 more EVB return envelopes), the number of rejections from Signature Mismatches of EVB return envelopes decreased by 59.7% from 2016 to 2020. This inverse relationship requires explanation.
- 9,589 more EVB return envelopes were submitted for Signature Verification by Maricopa than the EVB return envelope images identified by EchoMail as having signatures.
- A full audit of Maricopa’s Signature Verification process is necessary, and can be accomplished by comparing each signature on EVB return envelope images with an image of the voter’s signature from voter registration files. This will provide a scientific quantitative metric to assess confidence level of Signature Verification.
- Disclosure of Maricopa’s Standard Operating Procedure (SOP) for EVB processing, Chain of Custody, and Signature Verification methods, including the SOP and methodology for curing questionable signatures, is necessary.
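The headline percentages in the findings above reduce to simple arithmetic. The sketch below is an illustrative check only; all counts come from the report text, and the variable names are ours, not the report’s.

```python
# Arithmetic check of the rates quoted in the findings above.
total_2020 = 1_918_463        # unique EVB return envelopes per Maricopa
bad_sigs_maricopa = 587       # "Bad Signatures" reported by Maricopa
scribbles_echomail = 2_580    # Scribbles identified by EchoMail

maricopa_rate = bad_sigs_maricopa / total_2020 * 100   # ~0.031%
echomail_rate = scribbles_echomail / total_2020 * 100  # ~0.134%
ratio = echomail_rate / maricopa_rate                  # ~4.4x, "more than four times"

# 2016 -> 2020 comparison from the findings
total_2016, mismatches_2016 = 1_257_179, 1_456
envelope_growth = (total_2020 - total_2016) / total_2016 * 100                 # ~52.6%
mismatch_drop = (mismatches_2016 - bad_sigs_maricopa) / mismatches_2016 * 100  # ~59.7%

print(round(maricopa_rate, 3), round(echomail_rate, 3), round(ratio, 1))
print(round(envelope_growth, 1), round(mismatch_drop, 1))
```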
Transcript Of Dr.Shiva’s Live Presentation To Arizona Senate Officials
President Fann: Our first presenter is joining us via zoom, because he lives on the east coast. His name is Dr. Shiva. He was the one commissioned by the Senate. And that was because we finally got the envelope ballots, or the ballot envelopes. So we could check signatures or lack thereof. And this was his part of the audit. Dr. Shiva. Welcome to the Arizona Senate. How are you today?
Dr.Shiva Ayyadurai MIT PhD: Thank you very much for having me. Honorable President Fann. Thank you very much.
President Fann: Absolutely. So
Dr.SHIVA: Thank you too Senator Peters,
President Fann: Thank you, the floor is yours. Please proceed.
Dr.SHIVA: Thank you. Before I start the presentation, which is also accompanied by a report, I just want to make one preparatory remark, as a follow up to Senator Peterson. We live in the age of what we call engineering systems, complex systems: taking an airplane, using an iPhone, self-driving cars. These are highly complex systems, which the modern world knows as engineering systems. Our election voting systems are also engineering systems.
I want to thank the leadership of the Senate, the stakeholders, because they’ve taken a historic step here to bring the same level of engineering systems capabilities to election voting systems. And one of those important attributes is that whether an anomaly is small or large, or insignificant or monumental, it all must be welcomed, because from an engineering standpoint, it can only do one thing: we just enhance the system.
So, as President Fann said, I’m going to be sharing with you a particular area that we looked at, which was the early voting ballot return envelope images. That’s what I’m going to share with you. So let me begin.
I’m going to start up the PowerPoint. The title of this talk is Pattern Recognition Classification of Early Voting Ballot Return Envelope Images for Signature Presence Detection. Signature presence detection: we’re not doing signature detail verification, we’re doing signature presence detection, and we’re taking an engineering systems approach.
I will go through this agenda. We’re going to give a little bit of background, we’re going to review the Maricopa results, we’re going to go through what our analysis resulted in, compare them, then report on some of the key findings and anomalies. Then also raise questions for Maricopa officials, and then propose some recommendations.
The executive summary of this, to sort of summarize a whole talk here today, of the findings is that the early voting ballot, the return envelope, that’s really the protective vehicle by which the EVB, which is the early voting ballot, is transported and processed. And the authentication, the verification of the signature on the EVB return envelope is critical to the reliability of the entire process. And our audit reveals anomalies, raising questions on the verifiability of the signature verification process.
Let’s begin. A little bit of background, you can read more about it, but I was honored to be selected because my background for more than 40 years has been in the field of pattern recognition and classification, particularly Systems Science, and it goes back a long period of time. But just very briefly, in this field, you look at reality. In this case, we’re looking at the early voting ballot system, and then you try to model it. That’s typically what science and engineers do.
For example, for many years I looked at how babies were processing sleep and trying to understand their system of looking at sleep patterns. Same in the area of deaf blind communications. You have the reality, you have signals, same in the area of looking at a bridge, and you get signals from it.
Based on the signals you’re trying to predict what’s going on in that bridge. And same if you look at an aircraft wing without having to open it up, you look at these signals, and you’re trying to predict what’s going on. And for many years, we did a project on looking at handwritten bank check numerals. And these different areas, including Email, looking at what’s inside of an Email as a system, these are all pattern recognition problems.
And in fact, if you look at food, trying to figure out will this food cause different kinds of diseases. But the reality in this area is that you are looking at the particular real world and you have a model. You can look at the heart and you can look at the cardiovascular system of that. And you can say, hey, I have some states of the system which are normal signals, and some states which are abnormal signals.
In this case, what we’re looking at is we have the early voting ballot system. And we have the early voting ballot return envelope images. These are the signals that we received to try to figure out what’s going on inside that system. You have the reality of what took place, and you have the expression of that which is embodied in those images.
And this is what one of those images looks like. You have an image, you have the name of the voter, and here you have the signature. And we were asked as a part of the scope of this audit not to look anywhere else but to look in the signature region, and to look to see if there was a normal status signature there, or the abnormal states.
Blanks split into two, which we’ll talk about in more detail: a complete blank or a likely blank. And then there is what we call a scribble. A scribble is a step toward asking, hey, was this signature even valid? It’s a very, very rudimentary step we took, not full signature verification.
So, early voting ballot system and you have these images. Before we head into the actual process we did, let’s look at the reality of what took place. This is on the left side, the actual reality of what was reported by Maricopa County in their voter education report, what was called their CANVASS report.
What we see here, if you look carefully, are the numbers that they reported in 2016, but more importantly we’re concerned about 2020. In 2020, one of the lines in their report is called early ballots verified and counted, which is 1,915,487. And the rejected early ballots, these numbers here, are the ones that were not included in the final count.
Bad signatures means they went through a signature verification process, and out of the 1,918,463 signatures, 587 were deemed to be bad signatures. And the way we get this total is you add up the ones that were verified and counted, the ones that were bad signatures, the ones which had no signatures, which were blanks, and the late returns.
To be clear, according to the report, this was the total number of early voting ballot return envelopes received by Maricopa County. To make it a little more clear, we have 1,918,463 total unique EVB return envelopes, of which 1,455 were found to have no signatures. And at the end, there was an amount of 1,917,008 ready for signature verification. And as a result of that, 587 were found to have bad signatures.
Now, signature verification is a detailed process; you can read the report, but counties vary. Maricopa, along with Florida, is apparently considered to have one of the better signature verification processes. But according to that, out of the entire lot of 1,918,463, only 587 were rejected after this verification process, where someone looks at the signature against the voter registration signatures, typically doing what’s called a 27-point analysis.
So, 587 were found to have bad signatures, 934 with late returns, for a total amount which matches with what was reported of 1,915,487. So, this is Maricopa’s reported results to the public.
I’m going to go through the scope of this audit by also going through the process; it’ll be somewhat educational. When early voting ballots, which are a subset of mail-in ballots, are submitted, those ballots are imaged, meaning scanned and converted to digital images. The EVB return envelope, the outside covering within which the ballot is contained, is scanned before the ballots are even tabulated. The EVB return envelopes are then opened, and those scans create the images.
In fact, there are six different kinds of EVB return envelope formats. Let me walk you through some of them. The majority of them we call the standard image files. That’s our lingua franca; it may not be the official terminology, but it serves for the purpose of this presentation. This is one way that they look.
There is also another set called UOCAVA image files, and there’s three different kinds. You have the signature here, the date here, the name of the voter here, another version of that is a little more complicated. It’s more of an affidavit. The signatures here, the name of the voters here, and obviously we redacted stuff for privacy.
And the third type of the UOCAVA is type C. So, you have UIF: A, B, and C. Then finally we have large print image files LIF, and these are for people who have trouble seeing, so the instructions and everything are larger. And then finally, you have the Braille format. So again, six different image formats that we have to process.
Let me walk you through the counts that we got; we received a drive with all of these images on it. The SIF, which was the first one I showed you: 1,919,598, which is about 99.5%. The UIF-A: 8,849, at 0.459%. The UIF-B: 277, at 0.014%. And the UIF-C, the affidavits, are a small set at 0.001%.
And then finally, we have the large formats which are 475, 0.024%, and the Braille about 29 at 0.02%. That added up to all of them. For the audit, we were provided by the Arizona State Senate 1,929,240, and these were apparently all the EVB return envelopes that Maricopa County got.
What we noticed when we looked at this was fascinating. I guess it wasn’t an anomaly, but it was something that surprised us a little, because we thought the duplicates would have been removed, but there were duplicates in here.
So, we had to remove duplicates. Out of those sets, these were the duplicates. We found 16,934 voters who had each submitted a 2-copy duplicate, which means each of those voters submitted two early voting ballot return envelopes: 33,868 images, of which 16,934 are duplicates.
Interestingly enough, we also found another 188 unique voters who had submitted 3-copy duplicates: a total image count of 564, of which 376 are duplicates once you keep one per voter, because you’d want to keep one of them. And then finally, we found four individuals with 4-copy duplicates: 16 total images, of which 12 are duplicates.
The total was 34,448 duplicate images, of which 17,322 were duplicates, from 17,126 unique voters. By the way, this was not reported in the CANVASS report. So at the first level of analysis: from the 1,929,240 EVB return envelope images we received, we subtract out the 17,322 duplicates, and then we have 1,911,918.
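The duplicate accounting just described reduces to a few lines of arithmetic. This is a reconstruction from the counts stated in the talk, not EchoMail’s actual code; the variable names are ours.

```python
# Duplicate accounting from the talk: voters grouped by how many copies each sent.
copies_to_voters = {2: 16_934, 3: 188, 4: 4}

total_duplicate_images = sum(c * v for c, v in copies_to_voters.items())  # 34,448
unique_voters = sum(copies_to_voters.values())                            # 17,126
removed = total_duplicate_images - unique_voters  # keep one per voter: 17,322

received = 1_929_240
unique_envelopes = received - removed             # 1,911,918

print(total_duplicate_images, unique_voters, removed, unique_envelopes)
```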
The process that we were really commissioned for was a very basic analysis to see if there is a signature there or not: signature presence detection. The goal is, is there a signature in the signature region? Interestingly enough, some people don’t follow instructions and write all over, but it wasn’t our job to scan everywhere; our job was to look in that signature region. So again, specifically, we looked in the signature region: one category is that the signature region has a signature; the other categories are blank, likely blank, and scribble.
Just to keep it real simple, there’s a very simple classification. If it’s 0% non-white pixel density, which means nothing in that area, it’s a blank. If it has greater than 0% but still no more than 0.1%, like this example here, we put it into another category just for early classification purposes; we call it a likely blank. And then if it was greater than 0.1% but no more than 1% density, that was denoted as a scribble.
Then we did a distribution curve: we looked at all of the ballots and did a distribution of pixel densities, to give you an idea of how liberal we were in accepting something as a signature.
If you look at this, it was only this little area here, between 0.1% and 1% non-white pixel density, that’s the definition of a scribble. Everything else we were quite liberal about and accepted as signatures, even in our analysis. So again, we were not hired to do signature verification. Now, after this, typically signature verification would take place, which is what the county did, and then they would also open up the envelopes and tabulate them.
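The four-bucket rule described above (split at 0%, 0.1%, and 1% non-white pixel density) can be written as a tiny classifier. This is a minimal sketch of the stated thresholds, not EchoMail’s actual pattern-recognition system; the function names are ours.

```python
def nonwhite_density_pct(pixels) -> float:
    """Percent of non-white pixels in a signature region.
    `pixels` is a 2D list where 0 = white and 1 = non-white (binarized)."""
    flat = [p for row in pixels for p in row]
    return 100.0 * sum(flat) / len(flat)

def classify_signature_region(density_pct: float) -> str:
    """Bucket a signature region by the thresholds described in the talk."""
    if density_pct == 0.0:
        return "blank"         # nothing at all in the region
    if density_pct <= 0.1:
        return "likely blank"  # a stray mark or two
    if density_pct <= 1.0:
        return "scribble"      # the 0.1%-1% band
    return "signature"         # anything denser was liberally accepted

# A 2x4 region with one non-white pixel -> 12.5% density -> "signature"
region = [[0, 0, 1, 0],
          [0, 0, 0, 0]]
print(classify_signature_region(nonwhite_density_pct(region)))  # signature
```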
Let’s go to our signature presence detection. We took a ballot, or it could be any ballot, we classified it into one of these six categories, that’s what the system did. And then for four of those, we removed the duplicates as I shared, and then we looked at the ballots. Once we knew which one it was, we identified what kind of region, we extracted the region. If possible, though, this was not part of the scope, we also tried to extract the name. We were able to do that for close to I think 99% of them. We also used different types of classifiers, which is in pattern recognition lingo, for each one of these different kinds of ballots to ultimately put them into one of these four buckets.
Dr.SHIVA’s EchoMail® Analysis of Maricopa County EVB Return Envelopes
Let’s look at the results. We’re first going to look at the non-duplicates, which means all those EVB return envelopes which are not duplicates. And here we see that for the overwhelming majority, 99.77%, we denoted a signature: again, anything greater than 1% pixel density. The scribbles we found were 0.13%, or 2,420. And the blanks fell into two groups: the definitive blanks, 1,771, and the likely blanks, 101; these added up to 0.1%. Then we also looked at the counts. Just to be clear, we have 2,420 scribbles and 1,872 blanks; we put both of these together in our sum total.
Similarly, we now went and looked, and this was a little more complicated, at the 2-copy duplicates, because we had to deal with those; there was a substantial amount, 34,000-plus duplicate images in all. The duplicate recognition split them into three groups, but we had subgroups. Think of the two copies of a pair as having a relationship, like a parent and a child. Both copies could have had a signature.
Or one could have had a signature and the other a blank; or a signature and a likely blank; or a signature and a scribble (C denotes scribble). If any one of them had a signature, we said okay, we call the pair a signature. Alternatively, in the scribble area: one could have had a scribble and the other a blank, or a scribble and a likely blank, or a scribble and a scribble; these were denoted as scribbles in the duplicate case. And then finally, you had pairs that were blank and blank, blank and likely blank, or likely blank and likely blank; those were denoted as blanks.
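The collapse rule for a voter’s duplicate copies, as described above, is effectively a precedence order: signature beats scribble beats blank. A minimal sketch, with label strings of our own choosing:

```python
def label_duplicate_group(copy_labels) -> str:
    """Collapse one voter's duplicate copies into a single category.
    Precedence per the talk: any signature -> signature;
    otherwise any scribble -> scribble; otherwise blank."""
    if "signature" in copy_labels:
        return "signature"
    if "scribble" in copy_labels:
        return "scribble"
    return "blank"  # only blank / likely-blank copies remain

print(label_duplicate_group(["signature", "blank"]))        # signature
print(label_duplicate_group(["scribble", "likely blank"]))  # scribble
print(label_duplicate_group(["blank", "likely blank"]))     # blank
```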
Just to give you some feeling of what these look like, here is one from the same person. Again, we’re not allowed to share the names of voters, etc., but you can see here’s one where there is a signature here and here, these are two ballots, both signed. Another example is one is signed, and the other is a complete blank. The signature regions we’re concerned with. Another is signed and the other one is a likely blank, it’s got some stuff here.
EchoMail® categorized that as a likely blank. Here’s an example where you have a signed one, but a scribble over here. Okay. And then here we have a scribble, but here a blank, and the blank has been stamped “Verified and Approved”; we’ll talk about this later, we’ll come back to this. Here you have one that’s a bit of a scribble, some stuff here, and a likely blank, and the one with the scribble was approved.
Here you have two scribbles, the exact same ones, and one of them was approved. And here you have a blank and a blank, and this blank has been approved. And here we have a blank and a likely blank, and the likely blank, with a little dot over here, has been approved.
And similarly, over here we have a signature and a likely blank, and again, the likely blank is being approved. And here’s an example of a three-copy one: we have three-copy duplicates, all blank. We’ll come back to showing how they were approved or not. And here’s an example of three-copy scribbles.
Alright, so that gives you a flavor of these and the analysis we had to do on these two-copy duplicates. On those two copies, we categorized the ones in green as signatures, the ones in red as scribbles, and the ones in black as blanks. Counting each pair once for the two copies gives 16,934, and these are the counts on that: 155 scribbles from the two-copy duplicates and 45 blanks from the two copies.
Interestingly enough, we also had three-copy and four-copy analysis to do; I’m not going to go into all the details of this. But among the three-copy duplicates, there were two sets which were blank and five which were scribbles. Now, we do the totals. If you add the non-duplicate blanks, 1,872, the two-copy duplicate blanks, 45, and the three-copy duplicate blanks, 2, you get 1,919 blanks. Similarly for the scribbles: 2,420 for the non-duplicates, 155 for the two-copy duplicates, 5 for the three copies, and we get 2,580 scribbles.
Our net results: we received 1,929,240. The 17,322 duplicates were removed to get 1,911,918 unique return envelopes. Then we subtracted the non-signatures, the blanks and what we call scribbles, to come up with a total of 1,907,419. This is what, in this analysis, should have gone to signature verification. Now, let’s do the comparative analysis, looking side by side at what EchoMail® uncovered from the audit and what Maricopa had. How many EVB return envelopes were received? Well, in the disk drive that we received, this is the number, but we don’t know in fact; it’s not reported in the Maricopa report how many actual return envelopes they received. The duplicates: we have 17,322, which were subtracted; this is unreported by Maricopa.
When you compare the unique EVB return envelopes, what you find is that Maricopa has 6,545 more than we had in our possession from those image files after removing duplicates. Now we go to signature presence detection: we discovered 1,919 no-signatures; they reported 1,455. So in this case, we have 464 more blanks. On the scribbles, they don’t have a scribble category, and we’ll discuss that: if we start thinking about the scribbles as potential bad signatures, we get some insight. But at the end of the day, what went to signature verification at Maricopa was 1,917,008, which is 9,589 more than we would have sent pursuant to this analysis. They also had the bad signatures and late returns that we’ve talked about.
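The side-by-side numbers in this comparison reduce to a few subtractions. The sketch below reproduces them from the figures given in the talk; it is an arithmetic check, nothing more, and the variable names are ours.

```python
# EchoMail side
received = 1_929_240
duplicates_removed = 17_322
unique_echomail = received - duplicates_removed                  # 1,911,918
blanks, scribbles = 1_919, 2_580
to_verification_echomail = unique_echomail - blanks - scribbles  # 1,907,419

# Maricopa side, per the CANVASS report
unique_maricopa = 1_918_463
to_verification_maricopa = 1_917_008
blanks_maricopa = 1_455

print(unique_maricopa - unique_echomail)                    # more unique envelopes at Maricopa
print(blanks - blanks_maricopa)                             # more blanks found by EchoMail
print(to_verification_maricopa - to_verification_echomail)  # more sent to verification
```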
Now, I want to go to the key findings and anomalies. You’ve seen the process, you’ve understood the methodology, we’ve gone through some detail; let’s go to the highlights. These are the key findings, and I’m going to go through each one of them. One key finding is that it’s unknown how many EVB return envelopes were originally received by Maricopa. We had 34,448 duplicates, duplicate images actually, from 17,126 unique voters; Maricopa reported no duplicates in the CANVASS report. We identified 464 more no-signatures, which would be blanks, than Maricopa. Maricopa has 6,545 more unique EVB return envelopes.
This is what’s interesting. Consider our scribbles: again, a very, very low tolerance. In pattern recognition, having done this for over 40 years, we could have used, say, 36 features; we used one single feature, pixel density. We would have loved to use more features, but we used just that one feature at a very low threshold.
Threshold is key here. We identified 2,580 scribbles; assume they are all bad signatures. Maricopa identified 0.031%, which is what that 587 represents; ours actually represents about four times that, and we’ll come back to it. Finally, Maricopa had 9,589 more net EVB envelopes submitted to signature verification versus what we have.
We’re also going to see shortly, through further analysis, a 25% surge of duplicates in the last six days, between November 4th and the 9th. We also saw some very interesting other anomalies, where blanks among duplicates were being stamped “Verified and Approved”. We also saw stamps of “Verified and Approved” in blank signature regions, and I’ll share those with you.
What’s more interesting, and what I would consider a potentially critical anomaly, is that we saw the “Verified and Approved” stamps appearing behind the envelope printing. I’ll show you this; it’s almost as though it was imaged on there. I don’t want to say Photoshopped, but put on there. It’s quite fascinating; I’m sure there’s some explanation for this.
And then finally, we have cases where we have two different voter IDs, having the same address, same phone, same name with matching signatures. Let’s go look at some of these anomalies a little more graphically. Anomaly One; Maricopa reported only 587 bad signatures. To give you some idea, that’s 0.031%. And for people who didn’t really like math, I thought I’d make it a little bit more pictorial.
This would be 1 bad signature for every 3,268 early voting ballot envelopes. Think about the 1,918,463 early voting ballot envelopes, the unique ones that Maricopa said they got: each piece of paper is about 0.1 millimeters thick, and if you stacked them up, you’d get about 630 feet, believe it or not, the height of the St. Louis Arch. The bad signatures, by comparison, would be a stack of only about 2.31 inches. So just to give you a pictorial understanding, a very, very low percentage was flagged; to be specific, 0.0306% of all EVB return envelopes were deemed bad signatures.
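The stack-height illustration works out as below, assuming 0.1 mm per envelope as stated in the talk; the unit conversions (304.8 mm per foot, 25.4 mm per inch) are standard.

```python
MM_PER_ENVELOPE = 0.1  # assumed envelope thickness, per the talk
MM_PER_FOOT, MM_PER_INCH = 304.8, 25.4

# ~629 ft (the talk rounds to 630 ft, the St. Louis Arch's height)
stack_all_ft = 1_918_463 * MM_PER_ENVELOPE / MM_PER_FOOT
# ~2.31 in for the 587 bad-signature envelopes
stack_bad_in = 587 * MM_PER_ENVELOPE / MM_PER_INCH

print(round(stack_all_ft), round(stack_bad_in, 2))
```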
Okay. In our case, we discovered 2,580 scribbles, again at a very low tolerance, which is 0.135%. And to be clear, we were not commissioned to perform signature verification; we were commissioned to identify the presence of blanks, scribbles, and signatures. If scribbles alone are considered bad signatures, EchoMail® itself identified 335% more bad signatures than Maricopa did from its entire signature verification process.
Okay, a very important point here. And by the way, if we look back at the State of Arizona 2016 general election, out of the roughly 2 million ballots that came in, they had a signature mismatch rate of 0.13%, close to what we would have had if we just counted those scribbles. All right. Again, just to restate: the scribbles are a very, very low threshold; we were not asked to do signature verification; again, anything between 0.1% and 1% pixel density.
The third anomaly, again, focuses on signatures. We did a randomized supervised review, that is, humans randomly looking at signatures and their legibility; this is a wonderful analysis that can be done on handwriting, comparing four weeks before the election and four days after. We can’t share the images with you, obviously, but we found that four weeks before, 95% were legible signatures and only 5% were illegible.
But four days after, 5% were legible and 95% were illegible. If I were to do a heat map, and again, this is a representation, it would look like this, where the red represents illegibility and the green represents legibility. If we were commissioned to do more, we could do this. But you can see a marked difference in legibility; there could, again, be an explanation for this.
Fourth anomaly. As the EVBs, the early voting ballot envelopes, increased by 53% in the general election from 2016 to 2020 in Maricopa, bad signatures decreased. Let me explain what I mean by that. In 2016, the election previous to 2020, 1,257,179 return envelopes were submitted and 1,456 were considered to have a signature mismatch, a rejection rate of 0.116%.
In 2020, we had nearly 52% more return envelopes, but the number of signature mismatches went down by nearly 60%, and the mismatch rate itself dropped by nearly four times. Envelopes went up; mismatches and the mismatch rate came down. A very interesting inverse relationship, which would be, again, one of the questions we have for Maricopa officials.
Anomaly number five: there’s no mention of duplicates in Maricopa’s CANVASS report. If you look here, here’s the CANVASS report; there’s no mention of duplicates. In our case, we found 17,126 voters who sent in two or more ballots as duplicates, as you’ve seen before. Anomaly six: 25% or more of the duplicates came in during November 4th to the 9th, essentially on Election Day and after. How did we find this? Well, here’s a plot. Let me walk you through it: on the y-axis is the number of early voting ballot return envelopes, and we’re plotting it by date. Again, this is based on the drive we got, whose images are timestamped by particular days.
It almost looks like an interesting heartbeat. On this day, 10/14, nearly 200,000 envelopes came in; on this day, nothing came in. So, you get this interesting heartbeat signal. Then we said, why don’t we layer the blanks, scribbles, and duplicates on top of this. Since the axis for the EVBs is larger, that’s on the right axis; on the left axis, I’ve done the totals for the scribbles, blanks, and duplicates.
What’s fascinating is, you notice in the early part, things follow the heartbeat. Okay. But somewhere along here, particularly on October 26th, this heartbeat starts separating; in particular, the duplicates line stops matching the heartbeat in many ways. Here, everything stopped. So we did further analysis on the duplicates and the EVB return envelopes, and this is what’s interesting.
What we’re plotting here is the duplicates as a function of the daily percentage of EVBs. Here’s the signal. What you see is suddenly what you might call a significant growth in percentage. In fact, on two of these days, 96% of the ballots that came in were duplicates. There was a serious number of duplicates.
In fact, the area under this curve is close to 30%; 25 to 30% of the duplicates came in between November 4th and November 9th. And that motivated us to also drop the blanks and the scribbles onto the same plot, and you see the same phenomenon there.
The next anomaly: we also noted that the EV33 system, which is the system that contains all the early voting ballots, had 9,382 voters who submitted duplicates, versus the 17,126 that we identified. But what’s fascinating is, when we matched those 9,382 voters against our 17,126, only 2,138 voters matched.
Let me repeat that. The EV33, as we understand it, and we’re not experts at this, is a system where all the early voting ballots are stored. When it was gone through, 9,382 voters were found to have submitted duplicates. Clearly those should be a subset of ours, because we’re supposed to have all the early voting ballots; yet we found only 2,138 matched.
And let me just walk you through some of these duplicates, where the duplicates are stamped and also approved. You see the duplicate, and here it’s being stamped; nothing on here, yet it’s being approved. Duplicate one, duplicate two: this one is being approved. And again, all of these came in after November 4th. Here, another duplicate being approved, and a blank. Here’s another one, example three: two duplicates, the blank being approved. Same here, a fourth example, and there are many others, but these are pure blanks, and they’re being approved. In fact, this is a three-copy duplicate, where two are approved: one voter sent in three copies and two got approved. Another is three-copy duplicates, where one is approved.

The next anomaly is “Verified and Approved” in the blank signature region. What do I mean by that? This is blank, but the “Verified and Approved” appears right in the blank. This is a process issue which we’d love to get answers to. Same thing here. Same thing here, the “Verified and Approved” is right there.
And then finally, this is an interesting anomaly where we have two EVBs from people with the same name, address, and phone number, with matching signatures, but two different voter IDs. We had to redact this, but imagine if you could see it: there’s a person’s name and address here, which are the same as the name and address here. Very similar matching signatures, same phone numbers, but two different voter IDs.
If you look at these visually: matching signatures, same phone numbers. Actually, there are three examples here. By the way, there’s not a person actually called John Doe; the names are redacted to protect the innocent. And this is Jane Doe: again, matching signatures, same address, same phone number, but two different voter IDs. Same here: two different voter IDs, same address, same name; we don’t have a phone number here. We have three examples of that.
Verified and Approved
Then finally, the last anomaly I want to show is something fascinating about the “Verified and Approved” stamp. You have to look at this carefully: it’s occurring behind the envelope printing. Let me explain. If you look at this carefully, this is an image of an envelope. Here is a printed triangle pointing people to sign here. Now, you would think that if it was stamped, the stamp should be over this printing, but it is actually behind the triangle.
You see it here again, close up. These are all different ballots, approved predominantly post November 4, where the image of the “Verified and Approved” stamp is behind the triangle. Now, maybe this is done for a good reason; maybe it’s an imaging technology. Typically, if you use Photoshop, you’d have layers; I don’t want to accuse anyone of that, but I just want to say it is interesting that the “Verified and Approved” is behind the printed envelope triangle.
Questions for Maricopa County
Alright, there are some examples here. These are the questions we have for Maricopa officials. One: Did Maricopa receive any duplicates? Again, as I’ve gone through, we have in our possession 34,448 images representing duplicates from 17,126 unique voters, being 2-copy, 3-copy, and 4-copy. The word “duplicate” does not show up as a keyword in their report, but it would be interesting to note that duplicates exist. Two: Why does EchoMail® have more “No Signature” envelopes than reported by Maricopa? Is it because we solely analyze the signature region, and if not, why?
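Counting duplicates of this kind amounts to grouping envelope images by voter and keeping voters with two or more images, which is how 34,448 images can trace back to 17,126 unique voters. A minimal sketch with hypothetical records:

```python
from collections import Counter

# Hypothetical envelope records as (voter_id, image_id) pairs.
images = [
    ("V1", "imgA"), ("V1", "imgB"),                 # a 2-copy duplicate
    ("V2", "imgC"), ("V2", "imgD"), ("V2", "imgE"),  # a 3-copy duplicate
    ("V3", "imgF"),                                  # single, not a duplicate
]

# Tally images per voter, then keep voters with 2+ envelope images.
copies_per_voter = Counter(voter for voter, _ in images)
dupe_voters = {v: n for v, n in copies_per_voter.items() if n >= 2}
dupe_images = sum(dupe_voters.values())  # total images involved in duplicates
```

Here two of the three voters are duplicate-submitters, accounting for five of the six images; the report’s 34,448 / 17,126 figures are the same tally at scale.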
The next question: Why did EchoMail® detect more scribbles than Maricopa’s reported bad signatures? Again, this comes back to a point I probably cannot emphasize enough: if our scribbles, at less than 1% down to 0.1% pixel density, were considered bad signatures, that would be three to four times more than Maricopa’s 587 bad signatures.
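A pixel-density check of the kind referenced here can be sketched as follows. The thresholds mirror the percentages mentioned above (under 0.1% ink treated as blank, under 1% as a scribble), but the function itself is a simplified assumption for illustration, not EchoMail’s actual method:

```python
def classify_region(pixels):
    """Classify a signature region given a 2-D list of 0 (white) / 1 (ink).

    Thresholds follow the transcript's figures: <0.1% ink density is
    treated as blank, <1% as a scribble, anything denser as a signature.
    """
    total = sum(len(row) for row in pixels)
    ink = sum(sum(row) for row in pixels)
    density = 100.0 * ink / total  # ink pixels as a percent of the region
    if density < 0.1:
        return "blank"
    if density < 1.0:
        return "scribble"
    return "signature"
```

For example, a 20×100 region with 10 ink pixels has 0.5% density and would be classed as a scribble under these thresholds.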
The other question concerns the date stamps in the directories: we have assumed that they are the dates on which the EVB return envelopes were received by Maricopa officials, and that is the assumption behind our temporal charts. And then: Why does the approval stamp “Verified and Approved” exist only on a small subset of the EVB return envelopes?
Out of all the 1.9 million envelopes, we find that most of the “Verified and Approved” stamps occur after 11/4, after election day, but they appear on only a sprinkling of envelopes; our initial supervisory review suggests maybe 10% have the stamp on them. The other question: Did Maricopa stamp some EVB return envelopes as approved, even though the signature region is blank, because they found the signature elsewhere? It would be good to know how they did that. What is the adjudication process?
SOP and Chain of Custody
Finally, all of this leads to a very important set of questions from an engineering standpoint. What is the standard operating procedure, the SOP, for the EVB processing? In any Engineering System you have SOPs, because again, we’re relying on this very, very important process of the signature and its verification. What is the SOP for the signature verification? What is the SOP for curing questionable signatures? And finally, what is the Chain of Custody? We’ll talk a little bit more about that.
Surge in Duplicates
And the last set of questions: Why was there a sudden surge of duplicates during 11/04 to 11/09, which is incongruent with the trend we were seeing in the early voting ballot return envelope counts? And why is the “Verified and Approved” stamp appearing behind the printed envelope triangle? How does that happen? Is there some imaging that’s done? Are the envelopes printed? I just have a question from an imaging standpoint.
And the other question: Can two different voter IDs be associated with the same person, with the same name, same address, and matching signatures? Is that allowed? And then finally, why are blanks being stamped as “Verified and Approved”? And more importantly, why does the “Verified and Approved” stamp appear in a blank signature region? Those are our questions.
Precision, Verifiability, Reliability, Auditability, and Reproducibility
In conclusion, as I started this conversation, the EVB return envelope is the container of the ballot. It’s a very important thing: if you think about the human body as a system, your skin is what contains you; that envelope is what contains these ballots, and it’s a very, very important part. There are significant opportunities to enhance Precision, Verifiability, Reliability, Auditability, and Reproducibility; in the world of Engineering Systems, we call these Properties. These anomalies give us a wonderful opportunity to enhance at least these five attributes. We believe what needs to happen, in conclusion, is a full signature verification audit.
Unverifiable, Opaque, Non-Transparent needing a Full Signature Verification Audit
No pun intended, it’s unverifiable: we can’t really verify this process. And there’s a lack of Systems Integration and reporting. For example, the EV33, which should have all the early voting ballots: just on the duplicate issue, we haven’t had time to do a full Systems Integration, because that wasn’t the scope, but even on the ballots we have far more duplicates than what are in the System. And to us, because we haven’t been able to get access to the standard operating procedures, the process is opaque and non-transparent.
The future research we believe is absolutely necessary is a full signature verification audit, which means doing the full 27-point analysis. We have the capability right now; we have everything imaged. If we acquire Maricopa’s SOP for signature verification and their 27-point analysis algorithm, we can replicate all of this and, using their algorithm, determine actual false positive, false negative, and error rates.
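The error rates mentioned here are standard confusion-matrix quantities. A minimal sketch, where “positive” means a signature accepted as valid and all the counts are hypothetical:

```python
def error_rates(tp, fp, tn, fn):
    """Compute error rates from confusion counts.

    tp: valid signatures accepted    fp: invalid signatures accepted
    tn: invalid signatures rejected  fn: valid signatures rejected
    """
    fpr = fp / (fp + tn)                    # share of invalid sigs accepted
    fnr = fn / (fn + tp)                    # share of valid sigs rejected
    overall = (fp + fn) / (tp + fp + tn + fn)  # overall error rate
    return fpr, fnr, overall

# Hypothetical audit outcome, purely for illustration.
fpr, fnr, overall = error_rates(tp=90, fp=5, tn=95, fn=10)
```

With these made-up counts, the false positive rate is 5%, the false negative rate 10%, and the overall error 7.5%; an actual audit would substitute the real adjudication outcomes.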
What I mean by this is that it would be a profound opportunity for improving US election processes, because this has not been done. Whether you read literature on the Left or the Right, everyone complains that the signature verification process has significant issues.
We have an opportunity right now, with this data, to do a 27-point analysis and really come up with an actual error rate, which would give us a Scientific Metric of the Confidence Value of the entire EVB system. And finally, we need to review the Chain of Custody. Today, when there’s a questionable signature, people call the voter, contact them, and have some conversation, which is then verified. Where are those conversation records? Are those tickets stored anywhere, and do we have access to them? That’s it. Thank you, everyone. Thank you very much. I’ll take questions if there are any.
Senator Peterson: Dr. Shiva, just before you get off here. I think it’s really important that we know your credentials; you kind of just breezed through that slide. Can you go back to that real quick and go over your credentials for everybody? That’s important for us as we’re reading your report.
Dr.SHIVA: Sure. Appreciate the opportunity to share that when we bring it up.
So let me just give you a little bit about my background. I have four degrees from MIT: a PhD in biological engineering, in what’s called Computational Systems Biology, which is all about doing computation to recognize patterns. My master’s is also from MIT, in mechanical engineering.
That was in computational wave propagation, looking at a very important area of pattern recognition called non-destructive testing, where you don’t want to actually open up a bridge or an aircraft, so you send signals in and classify them. Another of my degrees is from the MIT Media Lab, in scientific visualization.
That master’s also used the same techniques to do some of the earliest complex visualizations for classification. And my bachelor’s is in electrical engineering and computer science, also from MIT, where I built one of the first cardiology systems for doing pattern analysis of cardiology signals.
My focus for 40-plus years has been in this field of pattern recognition and classification: in biology, medicine, engineering (aeronautical, civil, electrical), banking and finance, and the military, across a range of areas including handwriting recognition on bank checks and email analysis.
As a graduate student, I won the White House competition for automatically categorizing the White House’s email; I was the only graduate student asked to participate. In 1993, email, which I created back in 1978, became a consumer application before it was a business application. The Clinton White House was getting tons of email, and they wanted to analyze it automatically.
I ended up winning that, left MIT, took a 10-year hiatus, and built EchoMail for pattern analysis and categorization of emails. I had also done a lot of work for many years as an undergraduate helping the deaf-blind, doing signature pattern analysis there.
Currently, I work in a company called CytoSolve, where we’re looking at analysis of biomarker signals to figure out the right combinations of medicines. That company actually got a multi-combination therapy allowed by the FDA for pancreatic cancer.
I’ve written a number of books; if anyone wants to go to VASHIVA.com, you can look at my biography over there. I’ve published in leading journals: Nature Neuroscience, one of the eminent journals in the world, Cell, Biophysical Journal, IEEE; these are high-impact peer-reviewed journals.
The Copyright Office issued me the first copyright for the invention of email. I’m a Fulbright Scholar, a Lemelson-MIT finalist, I won one of the earliest Westinghouse Science Honors Awards, and I was a nominee for the National Medal of Technology and Innovation. I’ve been invited to give distinguished lectures at the National Science Foundation and the FDA; on November 19, I delivered a lecture on the immune system and was invited to lecture at the NSF, where we discussed the immune system. And several years ago, MIT had me deliver their presidential fellows lecture.
Senator Peterson: Thank you for sharing that. You’ve raised a lot of very important questions for us to get answers to.
President Fann: Thank you, Dr. Shiva. Your report was extremely insightful. Considering the fact that we only got the envelopes a few short weeks ago, we appreciate you dropping everything, all of your other responsibilities, and jumping on this right away so that this could be accomplished by today’s hearing date. We appreciate that very, very much.
Dr.SHIVA: And I also want to thank Doug Applegate and Phil Evans, two of my colleagues at EchoMail, the EchoMail team, and in particular all the stakeholders. And again, I want to thank the courageous leadership of the Arizona State Senate. This will go down in history as one of the most important engineering events, not just an election event; it will go down as a very important event for the engineering of election voting systems. I really appreciate the opportunity, and I’m very honored to support this effort. Thank you.
President Fann: Thank you, Dr. Shiva. And for everyone in TV land: these reports will all be made available in their entirety. They’ll be uploaded on the website as soon as we can get them uploaded, so you will not see any of this personal information. The redaction that you saw covered personal information; we tried very hard to make sure we complied with the court order and that anybody’s names and addresses that were on there were not available to the public. However, the unredacted version will go to the Attorney General’s office so he can seek further investigation of these anomalies. Thank you, Dr. Shiva.
Dr.SHIVA: Thank you very much.
President Fann: All right. Thank you so much. And just also a note, it was interesting: we had received a lot of emails, affidavits, you name it, from people that actually worked at M. M. Tech and at the polls. They told us that when it all started (what is it, 27 or 29 points of signature verification? 27, thank you), that’s what you’re supposed to do: 27 points of signature verification. At some point it went to 20, then it went to 10, and towards the last few weeks, we were told that they were told to just stop checking signatures, “we’ve got to get this done.” So what he’s showing us here does, in some sense, correlate with the things that people had told us, for what it’s worth.