Video Transcript
Christine Owen:
Welcome to another episode of Identiholics. My name is Christine Owen, and today I have the honor to be able to talk to Maria Vachino. She is an expert in identity, and she works really closely in the federal space, but she also does a lot outside of federal. She's a, I would say, small but mighty person who really gets a lot of cool stuff done in the government. So Maria, thank you for coming today.

Maria Vachino:
Thank you for having me.

Christine Owen:
So I want to know, how did you end up where you are today? Because today you have your own business, and you're consulting for a lot of really cool people. How did you end up here?

Maria Vachino:
So actually I got into the identity space while I was at Johns Hopkins University Applied Physics Laboratory working as a computer scientist and software engineer, and I started to become really interested in the challenges with interagency and interorganizational collaboration. So this was back in the days when FaceTime meant meeting people in person. So it was really, really difficult to know who you were communicating with. So what people did was they didn't communicate, and it was really challenging. And I saw the impact in the disaster response challenges with not only interagency coordination, but coordination with state and locals and with other people who needed to have information very quickly. How do you do that? How do you authorize people and still keep out journalists, and also potential terrorists, from getting the highly sensitive intel that you need to coordinate disaster response? And the other issue I was seeing was the barriers to scientific collaboration.
There was a lot of really good deep research, and I'm a complete science geek, even though I'm not doing hands-on science myself. That's what I originally wanted to do. And I studied biology and chemistry, so I enjoyed talking to the actual working scientists at the applied physics lab. And I was like, "Why don't you just have a platform and coordinate with these people from these other universities?" Well, they're like, "I don't know if I'm talking to Dr. Jane Doe in quantum physics at Harvard or Jane Doe the freshman who's a part-time spy." They just didn't have the information they needed at the time. And so then HSPD 12 came out around this time, and the DHS S&T cybersecurity director, too many letters, had an identity management research program and an identity management testbed at the applied physics laboratory. And so I had the wonderful opportunity to join the testbed and join that program.
And I stayed there for many years supporting them. And they were starting with the federation problem and the PIV cards and the CAC and the TWIC and the FRAC and PIV-I, the whole spectrum of HSPD 12. And then not only authenticating across agencies, but also how do you authorize? How do you know that this is someone who can have the sensitive information? How do you do that quickly? So looking into that, into all the authorization problems, the policy, the federation, the technology, and really just looking at every aspect of how do you enable interagency and cross-organizational collaboration and access control. And as part of that, we ended up expanding out into other things after doing years of that type of research. And one of the programs that I ended up working on came about when the Know Your Customer law came out. So the White House had an initiative with a lot of interagency collaboration on how do you help the financial institutions actually implement that?
How do you know who that person is who's applying for a bank account, and make sure that they are who they say they are? And this is an unnecessarily challenging problem in the United States because of our federated system and all of the barriers. And I really, really got a deep appreciation for that. We came up with a profile, almost an exact standard, actually. We had a technical implementation in place, a prototype at the applied physics lab. We were so proud of it and we're like, "Now we just need AAMVA to allow us to do photo matching. We'll just send the photo. We'll send all the information on the driver's license. We'll get back a yes, no, perfect. They'll do a selfie."

Christine Owen:
What year was this Maria? What year was this? It was a long time ago.

Maria Vachino:
It was a long time. It was a long time ago.

Christine Owen:
Because HSPD 12 was 2004. Know Your Customer was, what, 2000? It was in the early teens.

Maria Vachino:
It was in the early teens, yeah. So I've been in this space a couple years now.

Christine Owen:
You're not a day over 21.

Maria Vachino:
That's right.

Christine Owen:
No, yeah, so you were having issues with AAMVA and face [inaudible 00:05:30]

Maria Vachino:
And then SSA as well. We're like, "Well, we need to know that at least this data matches a real human being in the United States." Right? And that somebody just didn't put together a string of numbers and say, "This is my SSN." And so we tried to get SSA to do it, and of course SSA said, "Sorry, wish we could. We're not in the business of identity management. This is not a national identifier. We're not allowed to. Sorry." So after many years the financial institutions worked with Congress and the banking bill was passed. And SSA was then told, and also given the authority, to do this and help banks with the Know Your Customer law, and do that SSN, name, and date of birth matching. And so then a friend and colleague of mine, Paul Grassi, who had already left the government, said, "I am working on this problem that vexed you so many years ago at SSA. Can you come join and help design and run the federation part of the solution?" Which was using OpenID Connect.
And so I left the applied physics lab and started working with him, doing other things as well, supporting GSA and some other agencies. But that's the problem that pulled me out of the applied physics laboratory and got me started away from the prototypes and the research. We did have some deployments, but really supporting agencies more directly with the citizen challenges, which has been fascinating. I mean really there are so many challenging problems with organizational identity, with allowing businesses access to agency services securely, and with citizens, supporting the full gamut from people who don't know how to take a selfie to people who are much more sophisticated. And so it is really a fascinating space within cybersecurity. And I really enjoy working in it because it allows me to basically do good in the world, which I [inaudible 00:07:46], trying to protect people from fraud and from having their PII and sensitive data stolen, their money stolen, which can be really devastating for people. And I also am a big fan of convenience. I have no patience, and so trying to get things to work faster and more conveniently is always something I enjoy.

Christine Owen:
Yeah, I think that's really cool, especially being in the identity space since HSPD 12 was passed or was created, which, for those of you who are not government acronym geeks, essentially said that every government worker has to have a card that has their identity on it, and it's essentially a PKI card, right? All the way to helping SSA today with some of their biggest challenges is pretty cool. The thing that's really interesting though is Know Your Customer has been around for a while, and the idea of face matching has been around for a while. Back in the day, I remember in 2015 we realized that the technology just wasn't there for accurate face matching, right? And now, because of the pandemic and the rise of rapid, non-face-to-face communication, and also needing that biometric face matching, I think we have a lot stronger of a biometric story for a lot of the vendors today, which is really good.
And in fact, I have heard that we are going to get a study out pretty soon that says that biometric matching actually is pretty equitable and does not discriminate against any race or similar characteristic, which is really good because in the beginning it really was tailored towards a certain type of person. So with the rise of biometrics, some people are very scared of it, some people really like it. Since you like convenience, I feel like you're probably more on the I'm-into-biometrics side. How do you feel about the use of biometrics for identity proofing and other matters, and then what do you see in the future, or even today, as what we're going to have to start thinking about with biometrics?

Maria Vachino:
I think it is absolutely critical to use biometrics for any higher risk applications, and I also think it's far more convenient for people. The fears around biometrics, as you mentioned, did start when there was an equity issue, because this was a field where a lot of the people who were creating the algorithms and the implementations were using and training on photos of people similar to themselves. So this was actually a pretty easy problem to fix. If you train on faces that look like the broad swath of humanity, then of course the algorithms are going to be capable of matching faces of all types, right? So that was an issue. The other issue was, of course, the lack of competition. As competition grows in a space, companies are more inclined to do better. And part of that, I really believe a big driver was the NIST facial recognition algorithm testing program, publishing that and showing which companies and which algorithms were at the top, what the flaws were, what the rates were.
I think that really, really rapidly drove the technology in this space, which is fantastic. I mean, I'm a huge fan of testing programs overall, [inaudible 00:11:30] associated with Kantara, was their VP of assurance for a while. And I've been involved with FIDO, they have a fantastic testing program, and the OpenID Foundation. I mean, testing really drives accuracy and it improves technology far more rapidly, especially when the results are published like that. Another issue with facial recognition is that people tend to conflate the one-to-many with the one-to-one. These are completely different circumstances. So when you're talking about identity proofing, all that you're doing is matching your face, your selfie, against an authoritative photo, ideally of yourself, though it could be a bad actor with an authoritative picture of you. So that's what we're trying to avoid. What they use in law enforcement is completely different.
What they're doing is they're looking at a ton of photos and they're trying to find out who someone is, and that is a very, very different procedure. And I think it's that conflation of those two very different use cases and different matching types that has made people so nervous. Because we live in an era where our faces are everywhere. I have a hard time believing that your typical citizen really cares about taking a picture of themselves for the most part. And there are cameras all over our cities now, which can make us much safer. People have Facebook, they have LinkedIn, I mean, our images are everywhere. So I don't know that that's really a barrier. And the fact is our information has all been stolen, there are constant breaches. I get email notices, I swear, twice a week that my data has been stolen in another breach, because people are really bad at cybersecurity still.
So everyone's date of birth, social security number, email address, full name, mother's maiden name, of course, my kids, they have my name. I mean, the mother's maiden name is archaic. I mean, not everyone even uses it. So all this data, your place of birth, your social, all of this information, this PII, is all out there. And some people just post it publicly as well, the answers, frankly, to a lot of the knowledge-based questions, on their social media profiles. So we're living so publicly, and our personal data has been breached so many times, that really you need to have biometrics to bind the person to the data they're presenting. There are other techniques, but they're a little bit more difficult, especially in the United States, because we don't have authoritative records of anything that's used for identification. A phone number is associated with your name, but you didn't have to show proof of identity to get a phone. You definitely didn't have to give a social security number or date of birth to get a phone.
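
To make the one-to-one versus one-to-many distinction above concrete, here is a minimal Python sketch. The face_embedding function, the cosine-similarity scoring, and the 0.8 threshold are illustrative assumptions, not any particular vendor's algorithm or a NIST-tested implementation.

```python
# Minimal sketch of the two face-matching modes discussed above.
# face_embedding() is a hypothetical stand-in for a trained face model.

import numpy as np

def face_embedding(image) -> np.ndarray:
    """Hypothetical: turn a face image into a fixed-length feature vector."""
    raise NotImplementedError("stand-in for a real face-recognition model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_one_to_one(selfie, authoritative_photo, threshold: float = 0.8) -> bool:
    """Identity proofing: does this selfie match the single claimed photo?"""
    score = cosine_similarity(face_embedding(selfie), face_embedding(authoritative_photo))
    return score >= threshold

def identify_one_to_many(probe, gallery: dict, top_k: int = 5) -> list:
    """Law-enforcement style search: rank a whole gallery against an unknown face."""
    probe_vec = face_embedding(probe)
    scores = {name: cosine_similarity(probe_vec, face_embedding(img))
              for name, img in gallery.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
```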

Christine Owen:
Yeah, but what's really interesting is phone number is not always associated with your name because if you were-

Maria Vachino:
And it's not, correct.

Christine Owen:
If your kids are on your phone plan, then their phone number is associated with you, right?

Maria Vachino:
Exactly. Exactly. And some people can't afford the regular phone plans and they have the prepaid phones, and then there's mailing address. Well, mailing address, again, is associated with just a name. It's not with date of birth, it's not with anything else. People move. I mean, you can use these techniques. These are super useful techniques, using authoritative addresses, but they're not really authoritative. So the only thing that truly allows you to bind to the person with a really strong degree of confidence is biometrics. We can use a lot of other anti-fraud techniques and a lot of other things to reduce the likelihood of fraud, but without biometrics you're never going to achieve the same level of confidence that you're going to achieve with biometrics.
Now, one of the issues still with the biometrics is that, I mean, we're using our driver's licenses, so they've got a little itty bitty photo, poor resolution. It wasn't taken with the intention to use it this way. The DMVs aren't making sure that that photo is of such high quality that when you take a photo of that photo and then compare it to yourself that it's going to work all the time. So that's an issue. And so it's not the algorithms that are the problem. The algorithms are fantastic. It is the fact that we are using such a poor quality authoritative photo when we're doing the biometric match for remote identity verification right now.
So that's one of the reasons I'm so excited about the mobile driver's licenses because they have the potential to allow us to do really high fidelity biometric matching in a very privacy preserving manner on the device itself. Because you're going to have the photo of the person, very high quality on your device. It's going to be digitally signed by the issuing authority, and you're going to be able to take a selfie, and ideally do that comparison, that match, on your phone. So all the privacy advocates should be happier because the photo doesn't have to leave your device. And if it's implemented correctly and securely and red teamed and everything else, excuse me, then it has the potential, at least in the United States, to be revolutionary when it comes to remote identity proofing. So I'm super excited about that.
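
A rough sketch of the on-device flow described above, under assumed interfaces: every helper here (verify_issuer_signature, extract_portrait, match_faces) is a hypothetical stand-in, not a real mDL API. A production implementation would follow the ISO mDL data structures and use a vendor face matcher.

```python
# Sketch of privacy-preserving, on-device identity verification with an mDL:
# the signed portrait never leaves the phone, only a yes/no result does.
# Every helper below is a hypothetical stand-in, not a real mDL API.

def verify_issuer_signature(mdl_document, issuer_public_key) -> bool:
    """Hypothetical: check the DMV's digital signature over the mDL data."""
    ...

def extract_portrait(mdl_document) -> bytes:
    """Hypothetical: pull the signed, high-quality portrait out of the mDL."""
    ...

def match_faces(selfie: bytes, portrait: bytes) -> bool:
    """Hypothetical: run the 1:1 biometric comparison locally on the device."""
    ...

def prove_identity_on_device(mdl_document, selfie, issuer_public_key) -> bool:
    # 1. Trust the data: was this mDL really issued by the DMV and unmodified?
    if not verify_issuer_signature(mdl_document, issuer_public_key):
        return False
    # 2. Bind the person to the data: does the live selfie match the signed
    #    portrait? The comparison happens on the phone, so the photo stays there.
    return match_faces(selfie, extract_portrait(mdl_document))
```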

Christine Owen:
Are you working on the NIST mobile driver's license or the NCCoE mobile driver's license group?

Maria Vachino:
No, I'm not working on it, lack of bandwidth, but I'm following it.

Christine Owen:
Yeah, I'm also following it. It's pretty interesting. I think one thing that's really interesting about that is that the next thing that they're trying to do is they're trying to figure out how to take that mobile driver's license and create a credential that you can use for authentication purposes. While on the one hand, I think that having an authoritative source, like a mobile driver's license on your device is going to help us a lot in being able to verify that you are who you say you are. We still need to take that and bind it to a strong credential. Yeah, that's something that we're working on, and talking about and educating people on.

Maria Vachino:
Yes, and it's also useful to have more [inaudible 00:18:30], having additional information about the device itself, about the phone number, about the origination of the request, location, I mean, there are all these other factors. And IDs are stolen, so then what do you do? So it is more complicated than just using an MDL in order to gain access to services securely.

Christine Owen:
Yep, I totally agree with that. And on top of that, I mean, if your phone is taken and you don't have strong authentication on your phone to get in, then someone could actually easily use your phone to become you, right?

Maria Vachino:
Yeah, that's an excellent point. A lot of people, they don't have any mechanism to lock their phone. They just leave it open, or they have a simple PIN or just their face, which can be vulnerable to attacks using a photo of someone to unlock their phone. I mean, it is definitely not sufficiently secure to just use the phone. I mean, multifactor of course means more than one type of factor, and one factor is the biometric. You have to have a device, something you have, a token ideally, and then a memorized secret. So you still need that memorized secret in order to protect things.

Christine Owen:
Yep. I totally agree. I'm really, I am very interested and excited about MDLs because I think that the MDL is one piece of a broader puzzle. Obviously I'm really into verifiable credentials, which make a lot of sense for everybody. But I think that, like you said, the MDL is one piece of that broader puzzle because I think that verifiable credentials take that MDL and then all these other risk signals and all these other factors and create a more holistic view of that person to create a verified identity. I think that's really important because just using one data source isn't always the best solution to validating either that that person is who they say they are, or even validating that that person has a valid authenticator and should access what it is that they're trying to access, right?

Maria Vachino:
Yeah, I mean, access control is a really challenging risk management problem, and you really need to understand the risks of inappropriate access, and then protect it appropriately. And so if you have something that's fairly low risk, maybe the MDL is sufficient. If you have something where, if someone else were to impersonate you, it's going to have a serious impact on your life or on your business, then you need to expand beyond that to include a lot of other information, the anti-fraud information, the shared signals. You have to create a more in-depth and complex solution to be able to solve the higher risk problems. Absolutely. Now, when we talk about verifiable credentials, I mean, are you talking about the, I mean, there's the W3C specification, and then people use the term more generally. So which one are you referring to?

Christine Owen:
I am into the W3C. I am a very big standards-based person. I totally agree with you that using NIST and other large bodies like FIDO and Kantara to be able to test that you are matching either standards or at least whatever best practices are, is really important. Because any schmo can go out there and create a product, but you want to make sure that it's a good product. So you need to have all of that testing and you want to make sure that that product not only is good, but also interoperates really easily with other products. So you want it based on standards. I think those are really important for any organization trying to decide, "Which product should I go with?"

Maria Vachino:
Yeah, yeah, those are critical. Absolutely. Now, currently in the W3C verifiable credential space, there are a lot of different options for standards. So we're going to need profiles, a lot of profiles, a lot of options. It's very complicated. And of course the MDLs are based on the ISO standard. So I also think of the whole space as having a number of different standard options to do the same thing. So we do have a lot of different flavors of standards in this space. It's sort of like federation.
You can use SAML, you can use OpenID Connect, you can use... there used to be more standards that people used regularly, but the two that most people use now are SAML and OpenID Connect. And I think it's going to be the same in this space. These are user-controlled credentials, or user-controlled verifiable claims, or verifiable attributes. And it's an umbrella term for things that allow the user to have some sort of authoritative copy of their information in order to share it with relying parties who need it, rather than going out and requesting that someone else verify the information for you.
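
For readers who haven't seen one, a W3C verifiable credential is roughly a signed JSON document whose claims a relying party can verify on its own. Below is a minimal, made-up example expressed as a Python dict; the field values are invented and the exact contexts and proof types vary by profile.

```python
# Illustrative shape of a W3C verifiable credential (all values made up).
# The issuer signs the claims; the holder keeps it in a wallet and presents
# it to relying parties, who check the proof themselves.

sample_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "ProofOfAgeCredential"],
    "issuer": "did:example:state-dmv",
    "issuanceDate": "2024-08-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder-123",
        "birthDate": "1990-01-01",
    },
    "proof": {
        "type": "DataIntegrityProof",
        "created": "2024-08-01T00:00:00Z",
        "verificationMethod": "did:example:state-dmv#key-1",
        "proofPurpose": "assertionMethod",
        "proofValue": "z...",  # the issuer's digital signature, elided here
    },
}
```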

Christine Owen:
Yeah, I totally agree with everything that you said. I look forward to the day when I have a wallet that has my verifiable credential in it. And then some sort of relying party, for example a site where I'm shopping, perhaps asks for verified information or just for information in general. And all I have to do is shoot over my address without entering it. Boy, wouldn't that be fun and easy, and make my shopping habit even worse? No, I'm just kidding. But I think, I totally agree. There is one group, and I cannot remember the name, that's currently working through all the different standards that are out there that work together but aren't necessarily the same, like FIDO, W3C, ISO, all of those. They're all adjacent and then sometimes slightly overlapping, but they're not the same.
And so this group is trying to figure out not only how to utilize all of them together, but also how to ensure proper federation when you have a mobile driver's license that needs to talk to a relying party to get a FIDO credential or something like that. So it's a fun space that we're in right now, I think, because today we don't really know what the future is going to be. We know that the future is going to be a combination of the things that we're talking about right now. We just don't know exactly how it's going to get adopted.

Maria Vachino:
Yeah, I can't remember the name of the group either, but the European Union is doing a lot in this space, doing the documentation and making progress on mobile wallets. And they have requirements around what the mobile wallets can do and the security. And so that space is moving along much faster thanks to what the EU is doing, I think. And it's going to be complicated. I mean, we're going to continue to have a number of different standards around the credentials themselves and the authentication techniques, and the federation techniques. And so I think that over time you're going to start to see the broker model grow and become stronger. So that when you need a bunch of different information from a person, or about a person, or about a business, they can reach out and pull in the mobile driver's license. They can pull in the verifiable credentials that have your certifications and other things that are used for authorization, for example, or maybe even your social security number.
So pull that information all together, and then reach out and look for the fraud signals and all the shared signals around suspicious use, and create a package that they can send along to the relying party. So I think there's a lot of potential with that model for the more complicated use cases that you tend to see in government and higher risk situations. But for citizen use cases, most of them, I think the mobile driver's license is just going to be fantastically helpful once we have that ISO 18013-7 standard in place and people have started the implementation. Right now, because you don't have that standard for the logical access control use cases, you have companies doing what they do and developing proprietary solutions. Apple has their own solution to share your birthdate and your home address authoritatively from your driver's license. So that's going to take some time, I think, for us to get a solid standard in place there that everyone can use.
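
Here is a hedged sketch of the broker model Maria outlines: the broker pulls user-consented credentials and risk signals from several sources and hands the relying party one package. All of the source functions and field names below are hypothetical placeholders, not any agency's or vendor's actual interface.

```python
# Hedged sketch of the broker model: aggregate a user's consented credentials
# and risk signals, then hand the relying party one vetted package.
# Every source function below is a hypothetical placeholder.

from dataclasses import dataclass, field

@dataclass
class IdentityPackage:
    subject: str
    verified_attributes: dict = field(default_factory=dict)
    risk_signals: dict = field(default_factory=dict)

def fetch_mdl_attributes(subject_id: str, attributes: list) -> dict:
    """Hypothetical: attributes the user consents to release from their mDL."""
    return {}

def fetch_verifiable_credentials(subject_id: str, attributes: list) -> dict:
    """Hypothetical: claims pulled from the user's verifiable credentials."""
    return {}

def check_shared_signals(subject_id: str) -> dict:
    """Hypothetical: anti-fraud and shared-signal lookups about this request."""
    return {}

def broker_identity_request(subject_id: str, requested_attributes: list) -> IdentityPackage:
    package = IdentityPackage(subject=subject_id)
    # Authoritative, user-consented sources first...
    package.verified_attributes.update(fetch_mdl_attributes(subject_id, requested_attributes))
    package.verified_attributes.update(fetch_verifiable_credentials(subject_id, requested_attributes))
    # ...then risk context about the request itself.
    package.risk_signals.update(check_shared_signals(subject_id))
    # The relying party integrates with one broker instead of every source.
    return package
```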

Christine Owen:
Yep, I definitely agree with that. So first off, one thing I've been thinking this whole time while you were talking about brokers is that connect.gov was just a little too soon for its time. Wouldn't it be nice if we had that today? Actually, HHS is trying to create that with XMS. And Canada and the UK are both working on that kind of broker exchange program, which I think [inaudible 00:28:27]

Maria Vachino:
I think we need that in the US as well.

Christine Owen:
Absolutely. Because in the US, we're not going to have one authoritative digital identity. It's just not going to happen. We'll never have it. We have no paper identity that is authoritative, so we need a broker.

Maria Vachino:
And there've been efforts to do it. I was supporting Phil Lam when he was at GSA, and he had the great idea to do that as well. And these things are really, really hard to get off the ground. I mean, they're not technically complicated, but the way money flows in the government is super complicated. How you fund things, and the interagency agreements, are super complicated, and all the issues with the Privacy Act. And there are so many governance and legal and institutional barriers in the US to getting these things done. So it's not the technology that's the problem. It's relatively easy to design these technical solutions. This is definitely not rocket science. Designing the technical solutions is not the issue. It's all of these other barriers that really need to be addressed.

Christine Owen:
Yeah. Personally, I think that the best way to do this is to start out in the private sector and then have government adopt it, just because that's... the biggest advances in government tend to happen that way. We used to have where government started the technology revolution and then private sector adopted it, but we're not in that world today unfortunately. But I mean, it's a good thing though. There's a lot of really smart people in private sector who can definitely solve this for us.

Maria Vachino:
Oh, absolutely. The only issue is that the private sector doesn't have the authoritative data. Now, MDLs will help to change that and verifiable credentials, but while they're locked away in the government, that's what's really creating barriers today. But I agree, the private sector is much, much faster.

Christine Owen:
Yeah. I don't think it's going to get any easier for the authoritative data to be shared in the government anytime soon personally. But maybe I'm wrong. Let's hope I'm wrong. I don't think I'm wrong. I remember in 2015 when we tried to unlock that beast.

Maria Vachino:
I don't think [inaudible 00:30:47].

Christine Owen:
It's been five years or 10 years almost, and we're still not there.

Maria Vachino:
I try to hope and then I give up hope. It's sort of a cycle.

Christine Owen:
Yeah, no, it's totally true. It's totally true. It's unfortunate. So we talked about the barriers to adoption of these things for tomorrow, not today, but what are the things that are scariest to you today?

Maria Vachino:
I mean, today the issue is that we don't have a great way to identity proof people remotely as it is, because of the issues with not using biometrics and because of the issues around the driver's licenses. I mean, especially for sensitive applications, people are motivated to just get a fake driver's license because we can't go back against the authoritative source for that photo. That's a huge issue for me currently. Our data's out there, and even though the data's out there, not everyone is using biometrics for things that I think should require biometrics. And then of course the biometrics themselves, it's not the biometric that's flawed, it's our way of implementing them. So I think until we get to the place where either we have the MDLs available to us, or AAMVA or the Department of State opens up the authoritative photo and allows the matching, we're going to still struggle with that, at least for remote identity proofing. Moving forward, deepfakes are certainly a huge concern for me. We're already at the point where voice is no longer usable for biometrics.
And a lot of our interactions are still over the phone, where people are allowed to call up agencies, for example, and say, "This is my name, my date of birth, and my SSN. This is my address. Let me change my bank account. Let me do this. Let me do that. Give me this information, mail it to me here. Let me change my address." This kind of thing is happening. And before, there was at least some small barrier. If you call up and you're a 20-year-old kid and you say, "I am a 75-year-old grandmother," the voice is going to raise some red flags with an observant telephone operator. You don't have that anymore. You can sound like anyone, and pretty soon we're going to have the issue with video. I mean, we already do to some extent, and it's going to improve rapidly. And so we're going to soon get to the point where, in real time, you can have an interview with me and it's not me, and it's going to look like me and it's going to sound like me.
And you're going to ask me questions, and I'm going to be able to respond in real time, well, the image of me and my voice are going to be responding in real time, and it could be someone else entirely. And what do we do with that? And right now, that's our backup for identity proofing. We're doing video identity proofing, and I think it's okay right now, August 2024. I don't think it's going to be okay in August 2025 to use video for identity proofing anymore. And this is going to create a barrier, because right now video has proven to be extremely useful for people who are not technically savvy with taking the selfies and doing the things with their phone. I mean, it has allowed increased equity, quite frankly, for remote digital identity proofing for a lot of people. And so what are we going to have in place of that?
So I think we're going to have to go back to some in-person proofing, and then also use things like kiosks. NextGen ID, for example, has a kiosk that Kantara recently certified as going up to IAL-3. And so you're able to go in, and it also works for IAL-2, identity assurance levels three and two, for listeners who aren't familiar with all the acronyms. Those are a very high and then a moderate level of confidence in someone's identity when proofing them. And so having that in-person element where it walks them through, and then you can have a secure video feed, that might be very useful. Because I don't know that we're going to be able to keep up, especially with the fact that people have the ability to use their own devices all the time. It's such a variety of devices that it's really going to be challenging to monitor and detect citizen BYOD manipulation of video feeds. So those are my upcoming concerns.

Christine Owen:
I totally agree. So an interesting point on that is that this is already happening today, but it was used benignly. So in the Indian elections, the PM candidates this year used essentially deepfakes to be able to talk in a very relational way with their citizen constituents, where they could go to a booth and talk to basically a surrogate of the candidate who had a skin on that looked exactly like the candidate themselves, very fancy, sounded like them, everything.

Maria Vachino:
It was fascinating. Yeah.

Christine Owen:
What a cool way to use technology. And then we think only of bad things. So that's our problem, right?

Maria Vachino:
Right. There are really cool applications, and I think it's going to revolutionize movie making. I mean, there are issues with the actors and everything, but it's going to open up a lot of creative outlets. So they do have a lot of potential, but then they also have tremendous potential to do harm.

Christine Owen:
Yes. So what we're seeing on the commercial side right now, to combat essentially people calling into the help desk and either using stolen voices or just KBIs, is that they're actually asking people to vet themselves very quickly in real time. So they're sent a link and they're told, "All right, take a selfie, take a picture of the back of your license, your passport, and prove that your information matches what it is that we understand you to be. And then we'll talk about whatever high risk thing, like changing your password or whatever." Now, why are we using passwords? That's a whole other story.
But that's really what is starting to come, not just from the commercial world. We're actually also seeing it in universities, because high profile alumni have started to get their accounts stolen, and it's exactly the same problem. So it's a really interesting way to fix it. I think it's a good way to fix it. And this is, I think, one of the best use cases for the idea of verifiable credentials, which would be, here's a magic link, prove yourself with your verifiable credentials. Send me the information that I need. And then it's just a yes, no, "I don't really need to know. I just need to know that it matches, and then I can keep talking to you." Right? If you don't have it, then, "No, thank you."
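
One way to picture the "I just need to know that it matches" pattern: the help desk sends only the attributes it wants to check and gets back booleans, never the raw PII. A minimal sketch with made-up field names, assuming a credential shaped like the earlier example:

```python
# Sketch of a match-only response: the help desk learns whether the asserted
# values match the presented credential, never the underlying PII itself.
# Field names are illustrative.

def attribute_match_response(asserted: dict, credential: dict) -> dict:
    """Return per-attribute booleans instead of the attribute values."""
    subject = credential.get("credentialSubject", {})
    return {name: subject.get(name) == value for name, value in asserted.items()}

# Example: the caller claims a name and date of birth via the magic-link flow.
claims = {"familyName": "Doe", "birthDate": "1990-01-01"}
credential = {"credentialSubject": {"familyName": "Doe", "birthDate": "1990-01-01"}}
print(attribute_match_response(claims, credential))
# -> {'familyName': True, 'birthDate': True}
```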

Maria Vachino:
Yeah, yeah. No, I think that's going to help tremendously. Of course, underlying this one technology is a very old technology that people still have problems implementing, which is PKI, public key infrastructure. I mean, it's all PKI. I started doing PKI back with HSPD 12, and the brand new stuff is all PKI based. But unlike with HSPD 12, you don't have a federal bridge where you have one point where you go and check to make sure that the certificate was issued by the authoritative source and hasn't been revoked, and check the whole chain of trust and make sure nothing has been compromised in that entire chain. Make sure the cert's still good. So there's this complication now because, well, it's PKI without the P. It's sort of more distributed public key infrastructure, and that's going to be a tough problem to solve in a way that keeps the security that we need for this kind of thing.
And of course, we also have to be ready to pivot because public key technology is all built on asymmetric keys, which are vulnerable to quantum machines, quantum algorithms. So I've been talking to so many people over the years, and some people say it might be 10 years, some people it's 50, some people it's 100, but no one says it's never going to happen. And so eventually we're going to have the problem where all of the public keys that we use today are going to be broken or breakable by quantum computers. So their use cases are going to be more limited, and we're going to have to... And NIST is working very hard on the alternatives, but it's going to be a matter of timing. Are we going to make that switch to quantum resistant algorithms and technologies before the general purpose quantum computers are built, or are we going to be late? So that's going to be interesting to see.
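
A conceptual sketch of the chain-of-trust check Maria describes, walking from the end-entity certificate up toward a trusted root. The certificate attributes and helper functions here are assumed stand-ins for real X.509, CRL, and OCSP tooling, not a specific library's API.

```python
# Conceptual sketch of validating a certificate chain: walk from the
# end-entity certificate toward a trusted root, checking each link.
# The cert attributes and helpers are assumed stand-ins, not a real library API.

from datetime import datetime, timezone

def signature_is_valid(cert, issuer) -> bool:
    """Hypothetical: was cert actually signed by issuer's key?"""
    ...

def is_revoked(cert) -> bool:
    """Hypothetical: CRL/OCSP lookup; 'make sure the cert's still good.'"""
    ...

def chain_is_trusted(chain: list, trusted_roots: list) -> bool:
    """chain is ordered end-entity certificate first, root candidate last."""
    now = datetime.now(timezone.utc)
    for cert, issuer in zip(chain, chain[1:]):
        if not (cert.not_before <= now <= cert.not_after):
            return False  # expired or not yet valid
        if is_revoked(cert):
            return False  # revoked somewhere along the chain
        if not signature_is_valid(cert, issuer):
            return False  # a broken link in the chain of trust
    return chain[-1] in trusted_roots  # must anchor in a root you actually trust
```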

Christine Owen:
Yeah, I will say, so I know someone who is working on quantum computing, and they are literally smashing atoms to produce the energy needed to be able to do the computations. It is so cool. And they're actually working with NIST. So the good news is that NIST understands what the technology is becoming so that they can also combat what that technology is becoming.

Maria Vachino:
That's right.

Christine Owen:
All right. Well, I want to know, is there anything that you're doing that you think is really cool that you'd like to plug before we go?

Maria Vachino:
Yes. I have been working with NIST on a NISTIR that's hopefully going to be coming out for public comment soon. It's on attribute validation services. And so as soon as it comes out, hopefully people can look it over and provide us lots of feedback so we can improve it before publication. But we're hoping the paper serves a couple of purposes: one, educate people about the need for attribute validation services, the current options for implementing them, and also the future options such as the W3C verifiable credentials, the mobile driver's licenses, the other options that we're going to have hopefully fairly soon. It should also be useful for relying parties who might be interested in how to utilize these. It wasn't written for relying parties, but it still might be of interest for them as well, especially anyone who might eventually be interested in that broker model and potentially even becoming a broker. And so we'd love to get lots of feedback on that as soon as it comes out for comment.

Christine Owen:
Yeah, definitely let us know when it comes out and we will promote it and also review it, because that is what we're going to have to do. That's something that we definitely [inaudible 00:42:18]. That'll be a lot of fun. All right, well Maria, thank you so much. I could talk to you for ages, but I know both of us probably have other things that we need to do today, unfortunately. So thanks for being on with us and having a really awesome talk.