Video Transcript
Christine Owen:
Hi there and good afternoon. My name is Christine Owen, I'm the host here on Identiholics. I'm really happy that you guys joined us today, because I have two fabulous ladies with us. First off, I have Jamie Danker, she is from Venable. And I also have Carole House who is from Terranet.

All right. So, first up, I want everyone to be able to get to know you guys a little more. So, Jamie, tell me a little bit about your background, what you do now, and how you got to where you are?

Jamie Danker:
So, yeah, I'm at Venable now. My title at Venable is Senior Director of Privacy and Cybersecurity Services. I help clients basically build privacy programs: if an organization doesn't have a privacy program, I help them establish one, or, more often than not, it's improving existing privacy programs. And I like to use the NIST Privacy Framework to do that. My background, it's 20-plus years. I know, because I'm so young, it's hard to say 20-plus years.

Christine Owen:
You started when you were like two, right?

Jamie Danker:
Right, and I was a child bride and all the things.

Jamie Danker:
But yeah, so I started off my career in audit, so 15 years in civil service. The Government Accountability Office was my first stop, which was more of an accident than it was intentional. I'd been an intern at GAO, I think, my junior year of college at George Washington University, and went to Arthur Andersen right out of college, which, at the time, I was like, "Oh, it's a Big Five." It was the Big Five back then. And then it was September 11th, and then the Enron scandal happened. So I ended up calling GAO and saying, "Hey, is that offer still open?" And I ended up there, and that set me on this journey to be in civil service for 15 years.

So, I was on a cybersecurity and IT team, and had a mentor in counsel's office who really taught me everything about the Privacy Act, including the legislative history behind it. I just found it super fascinating, and found myself on lots of audits and reviews of federal agency privacy programs and their compliance with the Privacy Act and things like that. Then I flipped over to the operational side: I did a short stint at Deloitte, and ended up at the Department of Homeland Security's Privacy Office, which was the office I had just finished auditing at GAO, which was cool. Because even as an auditor, you actually have feelings about the programs you're auditing, and I always thought it was just really fascinating. This was the first statutorily created privacy office.

So it was cool. I helped them write their information sharing policy, and then it was a consulting fail where you fall in love with your client and you just end up like, "Hey, will you hire me?" So I went back into civil service and stayed at Homeland Security for another nine years. It was headquarters, and then I went to E-Verify to be the privacy officer there, and that's really where identity was introduced to me. I ended up working with NIST a lot, trying to make our Privacy Act system of records notices accurate and really describe what was going on with identity, and that's how I got interested in it. Got roped into working on the Digital Identity Guidelines that way.

Christine Owen:
Wait, you didn't just get roped in, Jamie, you are-

Jamie Danker:
I'm the author, yes, that is true.

Christine Owen:
Yes. Thank you.

Jamie Danker:
I am a named author on the Digital Identity Guidelines, revision 3, the privacy requirements and considerations. So that was with Naomi Lefkovitz, and it was Paul Grassi who was like... Because Paul, I worked with him to write a system of records notice for the Department of Homeland Security's online credentials, and I wanted to make sure that the notice covered more than just my particular use case.

I remember talking to Paul a lot and getting a lot of information, and then I ended up helping out with the drafting of the Digital Identity Guidelines. And then I was at NPPD, which is now known as CISA. I was the privacy officer there. Built that team, then left government almost six years ago and went to a small IT services firm, where I basically ended up supporting the NIST program and writing the Privacy Framework, which was an awesome experience. And now I've been at Venable a little over three years. So that was a long-winded introduction.

Carole House:
That's a great intro. How cool is it to get to use the frameworks that you've built?

Christine Owen:
I know.

Jamie Danker:
Yeah, really cool. I think my problem is I started off in kind of the judgy mode: you're an auditor and you're telling people all your findings and things like that. And I said, "Well, what would it be like to be on the other side, to actually have to implement the programs?" And then you're like, "Oh, this implementation is so frustrating." And then you're like, "There's got to be a better way to do it." And so then I got to work on guidance. So I like to toggle between the oversight, implementation, and guidance roles.

Christine Owen:
Yeah, I think all three of us have a little bit of that too. So Carole, let's talk a little bit about your background, which is equally as impressive by the way.

Carole House:
It's interesting. God has a sense of humor in my career. I started off in the Army as an officer in chem-bio-rad-nuclear defense, which is gas mask stuff that is super cool, but people don't like it as much when you're the one in the gas chamber getting CS; it's just not a pleasant experience. Which also meant, in the Stryker brigade that I was in, I was in operations, so I got to do very cool stuff, like serve as our assistant chief of operations when we were downrange in Kandahar. I switched over to intel, also doing collection management work. So, all the assets that watch and listen to people, my job was to help make sure that they were pointed in a useful direction and racked against the right priorities.

Jamie Danker:
Is she a spy?

Carole House:
It was so cool. I loved that. Then I got out, did some grad school for security studies, and then my first tour in the White House was at OMB, helping to oversee civilian agencies' cybersecurity and working with what was NPPD, and would eventually become CISA, on some of their programs pointed at trying to help agencies, as well as industry, better secure themselves.

Spent some time on the Hill at Senate Homeland Security, also doing critical infrastructure and cyber policy. And then went over to Treasury. There's this bureau at Treasury called FinCEN, the Financial Crimes Enforcement Network. They're the anti-money laundering and counter-terrorist financing regulator. And I found out that they had a cyber section. So I went over there, because I'd sort of gotten to see from this perspective that cyber crime and financial crime have a massive overlap. A huge amount of cyber crime ends up being either to steal information or credentials to monetize later, or to directly conduct the fraud, the financial crime, and then do the laundering afterward.

So I was the lead for cyber, crypto, and identity policy there, which was super fun. And then I went to the National Security Council, my second tour in the White House, where I was driving our counter-ransomware work as well as our work on cryptocurrency, the future of money, and digital identity, which was super fun. Now I'm at Terranet Ventures, doing a bunch of work with some nonprofits and think tanks like the Atlantic Council, Georgetown, the Digital Dollar Project, and Third Way's US-China Digital World Order Initiative. So it's just been tons of fun, and I'm thrilled to talk to you guys about identity. I just got back from an identity conference focused on identity theft and fraud.

Jamie Danker:
Where were you this time?

Christine Owen:
That's in Nashville?

Carole House:
It was Nashville.

Christine Owen:
Oh. Yeah.

Carole House:
I went to the Nashville conference. Yeah.

Christine Owen:
But usually you're somewhere else in the world like Japan or Canada.

Jamie Danker:
Yeah, she's like the jet-setter.

Christine Owen:
Or like Switzerland. So it's pretty nice. So the one cool thing is, the three of us, along with a fourth who's also a privacy expert, tend to get together at least every other month, and we have tea and we talk about the issues that we're confronting at work. And we spill a little bit of tea as well, usually, when we talk. But the one thing that we have found (and partially it's because the other person, who unfortunately can't be here today, she, the privacy person, and I had to collaborate when I was doing security work) is that there was a really big need for different functions within that whole cybersecurity framework to be able to start working together more efficiently, because really all of these areas are siloed.

For example, anti-fraud is 100% siloed from security, which is siloed from privacy. And none of that makes any sense, because we all need to work together. So I'll start with Carole this time: let's talk about how you've seen an effective way for the different functions, under the broader scope of basically just making sure we protect our stuff, to work together. How can we do that better, in your observation?

Carole House:
Yeah. And that was a huge issue for us when I worked at FinCEN, that bureau at Treasury, because there's this issue where a lot of security functions sit apart from fraud and things like compliance and anti-money laundering obligations. Those are things like know-your-customer requirements, and requirements for financial institutions like banks to monitor transactions, looking for suspicious activity and then reporting it to us in suspicious activity reports. And we noticed that there was this huge silo, and it was wild: things that an institution saw were suspicious, and maybe filed a SAR on, wouldn't necessarily be shared with their cyber team or with their counter-fraud team. So even if activity was detected as suspicious, and there might be appropriate reporting on it, they weren't flagging to those teams that it was suspicious because it seemed like an account takeover or a fraudulent account. And so the measures to deny those fraudulent accounts access to their systems, or to integrate a known or suspected malicious IP address into their cybersecurity systems to try to prevent them from getting access to their networks, weren't being taken.

So this issue was something that we recognized at FinCEN, which is nice when you're at an institution that has the benefit of getting reporting that sits at the intersection of these things. That's often where people see that connective tissue that you ladies talked about, because you guys have served at the nexus of these things. So at FinCEN we issued some guidance and advisories highlighting this intersection and encouraging that these silos ultimately need to be broken down, and that what you needed was much greater collaboration across cyber, fraud, and anti-money laundering functions.

And while it hasn't necessarily been adopted across the whole financial sector, a lot of institutions, including very major banks, have taken really good measures to implement better controls. And FinCEN started to see massive improvements in reporting when cyber activity was seen as suspicious and it was touching a transaction or an attempted transaction. That stuff was now getting reported to FinCEN with malicious cyber indicators and things associated with it. And then FinCEN sanitizes those indicators and shares them back out with the industry via the FS-ISAC so that institutions can better defend themselves. So that's just one example that shows the government recognizing the problem, helping to equip industry, industry fixing some of that problem, and then better equipping government to help industry more on detecting illicit activity on their networks.

Christine Owen:
Yeah. And I think that's really important. That's something that Carole and I have been talking about recently: how can the government help private companies be in a position where they feel comfortable reporting their cybersecurity incidents? Because right now, it seems like they get dragged through the mud if something bad happens, right? So the government needs to make it a safer space, for lack of a better word, for companies to be able to go and report any issues or specific suspicious activity that they see. Which is really important, because as we've seen with something like Scattered Spider, what happens at one company is repeatable, and it happens at every single company. And that's something that we have seen: we've seen multiple companies need the exact same tool to be able to combat some Scattered Spider issues.

So for privacy, this is something that Jamie and I have been on a road show about for a couple of years now: the intersection of privacy and security, and how privacy is a very important piece of cybersecurity, and cybersecurity is also a way to accomplish the privacy goals that you're trying to accomplish. So you have some really good knowledge, not just from the government, but also from Fortune, I don't know, 50 companies. So what have you learned about when security and privacy work together?

Jamie Danker:
I think it's awesome when they do work together. I mean, in civil service I had the fortune, I think in being at CISA, of being close with the CISO side, and I think I got a lot more done. I don't think it's a secret that privacy offices are generally under-resourced. In contrast, you could argue that the security side is comparatively well resourced; comparing apples-to-apples budgets, security generally has more resources than privacy at any agency, in my opinion. So I think the better friends you can be with the CISO shop, the more effective you can be.

At DHS we did a lot of that, sort of injecting policy requirements into the system security policy. So it's compliance-related, but it's kind of the forcing mechanism to have visibility into what's going on within the organization. I've seen organizations, I think in the Fortune 100, that are siloed, and you see these regular issues where the security team is moving forward with something, or a product team is moving forward with something, that clearly is going to have privacy implications, and they're always afraid. I like this phrase: the privacy team's going to put the "no" in innovation.

Christine Owen:
Oh, I like that phrase. Of course you came up with that one. Pointing it out.

Christine Owen:
Of course.

Jamie Danker:
So I feel like I'm the kind of professional that likes to get to "yes, but," and help companies strategize on how you can get to yes, how you can build a program that complies with laws but manages privacy risk at the same time.

Christine Owen:
And I think you're right. I think a lot of people think of privacy as it's a yes or no function. But really it's a, "How do we put guardrails in the process to make sure that people who shouldn't have access to that data don't have access to the data?"

Jamie Danker:
I think it even goes beyond that. We were talking about the intersection of privacy and cyber, and I really like the NIST Venn diagram that articulates that intersection. We've got confidentiality, availability, and integrity when it comes to cybersecurity, so obviously confidentiality is that shared privacy objective. But in privacy, there's also so much more than that. Strong identity obviously protects privacy in terms of unauthorized access, but when you're establishing an identity, you're collecting an S-ton of personally identifiable... I'm censoring myself.

So when you collect information, it also creates risk there. And the risks that cyber doesn't necessarily concern itself with are kind of purpose-based: we collect the information for identity purposes, and you don't expect that information then to be used for targeted advertising. We've seen companies get in trouble with that before, where they establish multifactor authentication and then use that text message number to target you for ads, unbeknownst to the consumer, which causes surprise and can probably result in fines.
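To make the purpose-based risk Jamie describes concrete, here is a minimal sketch of a purpose-limitation check: the system records the purpose data was collected for, and any other use is refused. The field names, purposes, and function are hypothetical illustrations, not any real company's privacy-engineering API.

```python
# Record, at collection time, the purpose(s) each field was gathered for.
# A phone number collected to set up MFA is tagged only with "mfa".
COLLECTED_FOR = {
    "phone_number": {"mfa"},
}

def may_use(field: str, purpose: str) -> bool:
    """Allow a use only if it matches the purpose recorded at collection."""
    return purpose in COLLECTED_FOR.get(field, set())

print(may_use("phone_number", "mfa"))           # True: the original purpose
print(may_use("phone_number", "ad_targeting"))  # False: an incompatible use
```

The text-message-for-ads cases the speakers mention are exactly the second call: the data exists and is accessible, but the purpose doesn't match what the consumer was told.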

Christine Owen:
Yeah, so I definitely think those are all really good points, and I think those are things that, like you said, product owners don't always think through. Yeah, I really need this data, but then once that data goes into some sort of data lake within the organization, other parts of the organization go, "Oh, that's all good data."

Jamie Danker:
And a lot of things could be honest mistakes, but, and I'm not a lawyer, there are a lot of legal obligations too with these state privacy laws. You've got data subject access rights. So think about it: if you're copying your data over to multiple places and an individual requests all the information you have on them, you have to have the capability to provide that back to the individual.

Christine Owen:
Yeah, no, I agree. Oh, yeah.

Christine Owen:
Yeah.

Carole House:
Coco agrees.

Jamie Danker:
She does.

Christine Owen:
Yeah, Coco is such a good girl on privacy. She says that's exactly how it works. I can't see her. She's off camera going, "Give me some pets."

Carole House:
Yes. I really love both of you talking about privacy. It's funny: coming from the financial regulatory space and also dealing a lot with cryptocurrency stuff, a lot of discussions and issues get raised around privacy, and frankly, the way they're framed, what they're really doing is conflating anonymity with privacy. And both of you talked about data being critical to privacy, because privacy doesn't mean the absence of data. It means that there is data that's sensitive and critical and important, but that there are some kinds of protections, permissions, and conditions behind how it should be disclosed or shared. And I just like that both of you were hitting on that in the way you were talking about privacy. It's why I love talking to these ladies, because they understand what privacy and security are. A lot of people conflate these things and don't understand the nuances. Privacy doesn't mean the absence of accountability. It in fact demands accountability.

Jamie Danker:
Yeah, there's a fair information practice principle that is accountability. But I am 100% with you. People have such strong feelings when they talk about privacy. I was actually talking to somebody who is in law enforcement, and her whole thing was that, in her personal life, she'd kind of slipped off the grid. I get not wanting to engage in social media, but she uses cash and all these things. I just thought it was kind of ironic, because it's like, law enforcement couldn't track you. But some people really feel strongly that they don't want to be tracked in any kind of capacity. And then maybe you've got the people who say, "Well, I've got nothing to hide," and I don't agree with that argument either.

Carole House:
Yeah. There is some legislation that just passed the House; I don't think it'll make it through the Senate, but it's an anti-central-bank-digital-currency bill, and it was mostly voted on along party lines, though there were a few Democrats that voted for it. This legislation bars any central bank digital currency from being issued by the Federal Reserve, which includes wholesale or retail, which basically means either retail, something that can directly reach consumers, or central bank to central bank. And even the Congressional Budget Office said that it might prohibit existing digital forms of central bank reserves, which would limit the US central bank's ability to conduct monetary policy as it currently does. So I'm glad that this does not likely have legs on the Senate side. But there was a specific aspect in there about the desire for the privacy features of cash to exist in any future of digital money, which is interesting, since I don't think the exact same privacy features of cash exist in any cryptocurrency.

They're kind of comparable, but not exactly equivalent; it's not the same. With cash, I can pass $5 from me to you and I don't need a third party to do it. In cryptocurrency, while they call it peer-to-peer, it does require third parties, like validators and miners, to validate that it is a valid transaction that gets published to the public ledger. So it may not require a bank to take the money and move it somewhere, but there's an interesting role that third parties play in crypto. So I don't think it's exactly equivalent to cash. And then also, most cryptocurrencies publish to public ledgers, which cash does not. So in my view, it was actually asking for privacy features in digital money that do not presently exist in basically any cryptocurrency, or at least not exactly. So it was interesting that the first privacy legislation to pass the House wasn't a consumer-focused privacy statute.

Jamie Danker:
I can see that. I mean, I hear your points, but I also feel like from a consumer point of view, if I want to use cash, I might want to be anonymous.

Carole House:
Yes.

Jamie Danker:
I don't think your privacy should be worse just because you go digital, and you're probably not saying that either, but I think that applies even for going digital with any kind of credential: we shouldn't design things such that our privacy is worse. There are privacy-enhancing technologies. What's the opposite word of enhancing?

Christine Owen:
I don't know.

Jamie Danker:
Privacy reducing. Yeah?

Carole House:
Reducing. No, and I totally agree with you. My view is, I guess, the irony of pointing to crypto's privacy as if it's private when it's not private.

Jamie Danker:
When they come off as private, yeah. It's not apples to apples.

Carole House:
Exactly. My voice against this bill was for a lot of other reasons, but on the privacy piece, I wasn't saying that it shouldn't be private. In fact, I think the only way for it to be adopted with consumer trust would be with privacy preservation. It's just that it won't look the same as with cash. And so that's where I was like, this would warrant further explanation.

Jamie Danker:
Privacy risk assessment perhaps.

Carole House:
Yeah, exactly. Using Jamie's framework. That's what we should do.

Christine Owen:
So I agree with all that. I think part of it, though... so I agree, and this specific use case is totally different, because it's cash, right? We're talking about replacing cash. Although if you digitize cash and then don't have a ledger, then you can't actually go and see where the laundering occurs. So it becomes a bit of a problem there. But let's move it over to the security side. I was actually thinking about this while we were talking: FIDO, which was created to preserve privacy. It's a very strong, phishing-resistant token that allows someone to log in, and for the relying party to know that there is a strong likelihood that that person is the person who holds the credential. Not that they're supposed to have the credential; you have to add additional things to it, like identity proofing.

But they more likely than not actually are in possession of that token. I'm going to go into a little bit of FIDO. Currently, FIDO is talking about adding provenance and other things, and in some cases, people who are very into privacy preservation say, "Well, you're stripping away the privacy preservation, because you're stripping away the anonymity and pseudonymity." These words are hard.

Jamie Danker:
Identity words are so hard to say.

Christine Owen:
They are hard.

Jamie Danker:
Like pseudonymous.

Christine Owen:
It has nothing to do with what we have in our hands. It's absolutely just the words. Yeah, like you said, it's hard to say. Which was a very good idea, as was, for example, the whole concept of blockchain and being able to do everything anonymously. But what we're seeing in security is that we actually need to have those ledgers, we need to be able to see what's going on in audit trails, and we need to see the provenance of the token that's coming, for various reasons. So I don't know if you guys have any thoughts on where it is. I know that I just sprung this on you, and I'm the one imposing all of the FIDO on you guys, but it is something that isn't openly discussed on LinkedIn: the idea of how do we preserve privacy, not anonymity, just privacy, while also keeping a stronger security posture.

Jamie Danker:
So I think you have to look at retention schedules, and I think you have to look at stakeholders who are product experts. And this is one area I felt like I was blind on, not for lack of trying, when I was in government. Data minimization you can accomplish in many ways. One is having a retention schedule that is reasonable, not a 100-years kind of situation. And so I think you need to have fraud experts who talk about what length of time is needed for law enforcement to prosecute a case.

Christine Owen:
Seven years.

Jamie Danker:
Okay, seven years. And that may even be too long. So I think there has to be some sort of compromise, because generally all these issues, I think, are too complex for the public to really comprehend. So I think it's on us to make decisions that balance things like fraud, cybersecurity purposes, and privacy altogether. So maybe it's not seven years; I think it depends on the use case. For riskier transactions, your retention might need to be a little bit longer, or you kind of have a tiered approach: you can only access the data in certain use cases. I think there are a lot of ways to get through it. I don't think it just means that you retain everything indefinitely. You have to have reasonable retention schedules.
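As an illustration of the tiered-retention idea Jamie describes, here is a sketch of a retention policy keyed to risk tier, with shorter schedules as the default. The tier names and durations are invented for illustration, not any actual regulatory schedule.

```python
from datetime import timedelta

# Hypothetical retention schedule: tier names and durations are
# illustrative assumptions, not a real regulator's requirements.
RETENTION_BY_TIER = {
    "low_risk": timedelta(days=365),       # e.g., routine logins
    "elevated": timedelta(days=365 * 3),   # e.g., flagged transactions
    "high_risk": timedelta(days=365 * 7),  # e.g., SAR-related records
}

def retention_for(record_tier: str) -> timedelta:
    """Return how long a record in this tier should be kept."""
    try:
        return RETENTION_BY_TIER[record_tier]
    except KeyError:
        # Unknown tiers fall back to the shortest schedule:
        # data minimization by default.
        return RETENTION_BY_TIER["low_risk"]

print(retention_for("high_risk").days)  # 2555
```

The design choice mirrors the conversation: only the records that fraud and law enforcement experts say genuinely need a long tail get one, and everything else ages out quickly.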

Christine Owen:
I actually really like where you're going with this, because you're bringing up something that I haven't thought of before. And I want to hear your thoughts, because you've dealt with this. When it comes to fraud, so anti-fraud measures, the retention requirement for companies who are working with places that could have fraud is seven years. They are required to keep all of that PII, all the identity documents, for seven years, and it's to help investigators. That's the whole point. But quite frankly, if you don't have something started within one to two years, are you really ever going to be able to prosecute those people? Yeah, what do you think about that?

Carole House:
So I do actually think that yes, we've seen instances, and trust me, I've been a voice for a while about the challenges I have with how long it takes for us to bring cases. Often there are reasons for it. Due process takes time, and it should. Certainly in my world, there was just a money laundering case brought against an institution, pointing to that and OFAC sanctions violations from, I think, eight years ago. And actually one was from 10 years ago, and we have some soul-searching to do on our side about why it's taking us that long. Because if the point of things like civil enforcement especially isn't just to hold bad guys accountable, but to shape an ecosystem and a sector, like the financial sector, to be compliant and to mitigate risks, we have to ask whether these timelines are actually doing that job. We want to be sector-shaping, not sector-breaking.

Beyond that, there's the fact that we need to figure out how to scale enforcement. Industry doesn't always like to hear that, but actually, early-and-often enforcement normally means smaller enforcement actions and helping to whip a sector into shape instead of crashing down on it. And then on the fraud side, which is criminal activity, this is where it's tougher, because we've definitely seen cases that point to activity that either a crime ring or even an individual had been guilty of years before, but ultimately, for it to rise to the level of reaching prosecution, it's often got to hit certain amounts and other thresholds, and we're definitely not doing enough to hold the bad guys accountable. I mean, the fact that we've now inspired a blockbuster movie starring Jason Statham to go after identity thieves-

Christine Owen:
What?

Jamie Danker:
I didn't know about this.

Christine Owen:
I don't know this movie.

Carole House:
I watched it. It's called The Beekeeper and I watched it going to the identity fraud conference that I just went to, which was also the funniest way to watch that movie on the way to identity fraud conference. And it is exactly what you would get from a Jason Statham movie.

Jamie Danker:
Sounds lovely.

Carole House:
Yeah. But the fact that we got a blockbuster movie, and everyone that I know in a lot of identity spaces was saying, "Oh my gosh, everyone has to watch this movie." And I'm like, "These were the bad guys, the guys that were going after the most vulnerable in our population." It's fascinating, and it demonstrates we've gotten to the point where society does want us to be able to do more with this. I hear you definitely on the data retention schedules, and there's always going to be this tension with law enforcement about what that timeline should be, the requirements that we put in place for banks and others to retain information under things like the customer identification program requirements.

And ultimately, the biggest investment that we really need to help solve all this: wouldn't it be great if we had a digital identity ecosystem where I don't need everyone to be able to hold and retain the information that then could be... I wish we were in a position where, and we wouldn't do this, but where we could publish the Social Security number list and it wouldn't result in a dime more of fraud. Again, I don't want to publish that list. I just wish that we were in a position where we had built the identity infrastructure such that there wouldn't be any more fraud just because you happen to know what is essentially an open secret. That is what I wish we could do: invest in that digital identity infrastructure, which would then enable us to-

Jamie Danker:
Preach, Dr. House. Preach.

Carole House:
Yeah, I know. I know. And Jamie has been the data minimization gospel singer for a while, and she's totally right. This is how we ended up in exactly this situation, of giant tech companies having massive amounts of data about consumers, by just not enforcing these kinds of requirements. And ultimately, none of us, I think, even understands how to redirect an entire economy toward one that isn't based on harvesting people's data. This is why I think Congress hasn't been able to make meaningful progress.

Christine Owen:
Well, I actually think that there are standards that have been put out that have actually figured this out. So what you are basically preaching, which I fully agree with, is the concept of digital wallets. You as a user, as a consumer, you want to have your own data protected, but someone else has to protect it for you, because you can't build your own FIPS-validated encryption module for yourself, right?

Jamie Danker:
Speak for yourself.

Christine Owen:
All right, well I'm going to hire you again.

Christine Owen:
I know I can't do it.

Carole House:
It's okay.

Christine Owen:
No, we have built-ins, right?

Carole House:
Oh, you're good.

Christine Owen:
Yeah.

Carole House:
Sorry.

Jamie Danker:
She's building it.

Carole House:
Jamie and I are using it.

Christine Owen:
Yeah. So, verifiable credentials: what you explained is essentially what we need in the future. And I am a big proponent of verifiable credentials, because I want to be able to keep my information in a place where I know that only I have access to it, and literally nobody else has access to it. And then if I want to share my information, I can share it. But I don't necessarily want to use my identity documents, like my driver's license or my passport, to authenticate myself into something; I need to be able to use a different credential that is attached to that VC to do it. But also, when I'm shopping on a website where I've registered before and I need shipping information submitted, I want to be able to use that verifiable credential to pre-populate the fields, because I'm too lazy to type in my address. So I think there are definitely use cases where we can do that. And I totally agree: I think putting all PII or identity documents into a data lake is not the best option, right?
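A rough sketch of the selective-disclosure idea behind the wallets discussed here: the holder presents only the claims a relying party asks for, instead of handing over a whole identity document. The claim names and plain-dict structure are illustrative assumptions; real verifiable credentials are signed objects following standards like the W3C Verifiable Credentials Data Model.

```python
# A full credential held in the user's wallet (illustrative claims only).
FULL_CREDENTIAL = {
    "name": "Jane Doe",
    "date_of_birth": "1990-01-01",
    "address": "123 Main St",
    "document_number": "D1234567",
}

def present(credential: dict, requested_claims: list) -> dict:
    """Disclose only the claims the relying party requested."""
    return {k: v for k, v in credential.items() if k in requested_claims}

# A shipping form needs a name and address, not a document number
# or date of birth, so only those two claims leave the wallet.
print(present(FULL_CREDENTIAL, ["name", "address"]))
```

This is the data-minimization point in miniature: the relying party gets what it needs to ship a package, and never holds the document number it would otherwise have to store and protect.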

Carole House:
True.

Christine Owen:
Yeah. What do you have?

Carole House:
Exactly. Wouldn't it be great to render all of those breached PII dumps available on the dark net moot and useless?

Jamie Danker:
That is so funny. I feel like you've been reading my email. You guys were probably all subject to the OPM data breach.

Christine Owen:
She loves reading your email.

Carole House:
Yeah, to the theft side work that I do for that.

Christine Owen:
Yeah, she absolutely did.

Jamie Danker:
She's a dark web agent. Actually I got a notification from I think whoever the identity monitoring provider is for the OPM data breach.

Jamie Danker:
... never risk that on the dark web.

Christine Owen:
Duh. Were you like, "Duh?"

Jamie Danker:
I was not alarmed by it. I feel like 10 years ago I might have been alarmed by it, but it's like I have multifactor authentication established.

Carole House:
This is the public service announcement.

Jamie Danker:
Yeah, public service announcement, have multifactor authentication. I had an experience this summer in Europe where my spouse's identity was stolen and it was kind of embarrassing in a way, because I felt like I failed as a cyber and privacy professional in some ways, because he had used the same password across multiple sites.

Christine Owen:
He didn't have MFA turned on?

Jamie Danker:
But in fairness, he did have MFA turned on for the financial accounts. But it was still very anxiety-inducing, because it's like, I know that... So the funny thing is, in the digital world, I'm becoming more of a digital adopter, because I feel like I cannot remember all the things I need to carry with me all the time. But it's that sinking feeling when you've left your purse somewhere or you've left your wallet. It could be the same feeling in the digital world.

Christine Owen:
So the interesting thing about this, and I think this is really important, because I was an advisor on this summer issue, is that one of the accounts was stolen because he used a password that was actually on the dark web, and he didn't know. So that username and password combination was on the dark web, and they used it. They probably did credential stuffing, that's my guess. And when they did that, it was a social login that then got into other accounts, and that was the issue. It was the social login aspect. So for those of you listening, if you have a social login, make sure you protect it with multifactor.
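The "password already on the dark web" problem is exactly what breached-password checks are for. A minimal sketch of the k-anonymity style lookup popularized by Have I Been Pwned's range API, simulated entirely locally here; the breached corpus and helper names are made up for illustration.

```python
import hashlib

# Toy stand-in for a breached-credentials corpus. Services like
# Have I Been Pwned store SHA-1 hashes and expose a "range" API:
# the client sends only the first 5 hex characters of the hash and
# matches the remainder locally, so the full password hash never
# leaves the device.
BREACHED_SHA1 = {
    hashlib.sha1(pw.encode()).hexdigest().upper()
    for pw in ["password123", "letmein", "qwerty"]
}

def range_lookup(prefix):
    """Simulates the server side: all hash suffixes under a 5-char prefix."""
    return {h[5:] for h in BREACHED_SHA1 if h.startswith(prefix)}

def is_breached(password):
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    return suffix in range_lookup(prefix)   # only the prefix is "sent"

print(is_breached("password123"))                   # True: known breach
print(is_breached("correct horse battery staple"))  # False
```

A login flow that runs this check and enforces MFA on social logins closes off the credential-stuffing path described above.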

Jamie Danker:
You don't want to lose a family vacation. And by the way, it wasn't.

Christine Owen:
Actually, let's be honest, it was slightly better.

Jamie Danker:
We got to rest.

Christine Owen:
Yeah, they got to rest. The ladies got a rest day; he got anxiety, but it was still a rest day.

Carole House:
Yeah. I remember when my Twitter was hacked, and I didn't even know, because I never use it. I'm just a lurker on Twitter, to watch news feeds and to see what crypto Twitter is saying; I've just never posted anything. But someone told me, "Hey, you just liked something." And I'd never liked anything on Twitter. And I went and looked, and there was, I will just say, a political group and an associated video that I had liked and followed that was absolutely not me at all. And I was just horrified, because I hadn't paid attention to that account. I created it way back in 2014, before I was even in cyber. And I just felt so bad, working in cyber, like, oh my god, because I've never used that account, so it hadn't even occurred to me.

Jamie Danker:
Yeah, yeah. I guess we all have, even as individuals, our own cyber hygiene issues.

Christine Owen:
Oh, absolutely.

Jamie Danker:
Like organizations. So I was just at a conference for the National Association for Public Health Statistics and Information Systems, and they had this fantastic presentation. It was an assistant U.S. attorney, I think covering Kentucky, and an FBI cyber agent. And they talked about this case; I cannot remember the guy's handle. But essentially, this guy had been able to steal credentials for the state of Hawaii's death registration system. And here's where basic cyber hygiene comes into play: the password was simple, and the account hadn't been used in two years, so the doctor had no idea it had been taken over. This individual went in and created a death record for himself. And the story was that he was basically doing it because, when you have a death registration certificate, it feeds into other government systems and has financial impact, like if your wages are being garnished because you're trying to skip child support, which is ultimately what this guy was trying to do. He was just trying to kill himself off in the records.

So he faked his own death in the records. I'm going to find the link and send this to you guys. And the agent and the assistant U.S. attorney walked through all the facts of the case, which was fascinating. And there were all these representatives from the state vital statistics agencies, and it really drove home for me that cyber hygiene is still an issue for all types of organizations and sectors. These agencies don't necessarily have the funding to prioritize strong cybersecurity, but there are definitely lessons learned, best practices: you probably shouldn't allow a dormant account to persist, you should put it in suspension or have some sort of process, have multifactor, all these different things. And apparently this individual was also trying to sell access to other states' vital records systems, where you can create fraudulent birth certificates and fraudulent death certificates. So to me it drove home that cybersecurity is definitely a national security issue. It exists in every single sector.

Christine Owen:
So what's really interesting is that that is essentially the use case, but on a larger scale from bad nation-state actors, that I have been rattling on about. Because what that person did, they didn't steal PII. They got in and they created a record. And what I am worried about the most, in national security inside the US government, is a bad actor coming in and creating records that are not real, that are fake: intelligence records, for example, records within critical infrastructure, records on how systems are working, things like that. Or putting additional audit records, or deleting and adding audit records, into a SOC, for example, and doing it at mass scale. That's my biggest fear. Definitely, using just a password on systems like that is going to help that bad actor, for sure. But it's interesting, because from a privacy perspective, do you think privacy also includes adding data, or manipulating a person's data and adding something to their record? Or is it only reading and deleting and taking?

Jamie Danker:
Oh, I mean, I think modification of someone's record counts, because privacy looks more at harm to individuals. You've got cybersecurity risks and then privacy risks. I like the definition: it's basically the risk to individuals. I forget the full definition, but with privacy risk, you think about it as the individual experiencing the direct impact, and that could be a tangible harm, like economic loss, and then you've got the intangible things, like dignity-related harms or embarrassment, and that harm ends up accruing to the organization. Oh, gosh. I was scanning through Netflix last night and they had the Ashley Madison documentary, so I'm just going to start watching that. But that's an example of privacy and security together, because, not to defend all those cheaters out there, but that's pretty embarrassing and heartbreaking for all those families. So that's harming them, however you feel about the website. That situation was both privacy and security.

Christine Owen:
Absolutely. So what keeps you up at night?

Carole House:
I know national security nerds, like all of us, are good at dreaming up worst-case scenarios. And I'll say that some of the stuff that scares me a lot... I like buzzwords, so the convergence of AI and blockchain is really interesting to me.

Christine Owen:
What horror.

Carole House:
I know, I know. Okay, so I'll start with AI. Of course, AI isn't new. If you haven't been using AI and machine learning, either in the conduct of your own activity or in things like compliance, you really haven't been competing in finance and the sectors that I've been in. And first off, there's the growth in sophistication of AI, but now also the democratized access, this paradigm shift toward democratization, which inherently comes with tech as it becomes more open source and available. This democratized, open access to really highly sophisticated compute and AI capabilities, and a lot of the large language models, is now making it almost trivial to circumvent most of the basic fraud measures that we have been telling institutions for years to put in place.

So there's this issue of deepfakes, and trust me, I'm also concerned about the integrity of elections, and the fact that we have not built the digital ecosystem for tracking and enabling things like provenance of data and information. And there's so much behind all of this, because once you start getting into censoring information, you hit the First Amendment every time as well. So there are really complex issues that challenge the ability to put accountability into these systems. And these are systems where we've liked openness, we preferred openness; that was why Section 230 was what it was, to help enable the internet to grow like it did.

But now we have both parties being interested in coming for things like platforms and Section 230, while no one has a viable alternative to improve on Section 230. But AI, this democratized access to really highly sophisticated capabilities... And even though most of the applications are positive commercial ones, I am excited about AI, and have been. My GitHub is hilarious, but I have built machine learning models and used them in a couple of places, whether for FinCEN or for work I did with CNAS on state veterans benefits, mapping them and looking at trends. I love AI. But then there's blockchain, which has the goods and bads that we can get into some time, including some horrifying privacy implications.

Christine Owen:
But wait, let's back up. When you talk about blockchain, you're talking about public blockchain?

Carole House:
Yes, public blockchain and crypto. Yeah.

Christine Owen:
That is a huge difference from private blockchain.

Carole House:
Sorry, that's a great point. Sorry. The way that I got introduced to blockchain was exclusively the public ones and cryptocurrencies. Everyone was excited about this new, very, very cool technology that combines "block chain," as two words as I knew it, and cryptography with distributed computing, and now you get blockchain as one word, essentially. That's a very rudimentary way to describe it. And then they chose finance as their first use case, and I'm like, "Great. Awesome." So, basically, publicly publishing people's financial transactions. However, a part-

Christine Owen:
Which is what Venmo was doing for a while, even though it's not a blockchain, but they were doing that.

Carole House:
Yes.

Christine Owen:
Also very scary.

Carole House:
Exactly. That was the default setting.

Carole House:
I know. Why do people want that?

Christine Owen:
You don't need to know that I gave you $5 for Thai food.

Carole House:
Yeah, there's a comedian I like a lot who made a joke about how surprised he was that all these people got on Venmo, because he was like, "This was for drug users." Let me be very clear that that is not how I would characterize Venmo. It was just a really funny bit by John Mulaney, the comedian.

Christine Owen:
Oh, yes, I remember that. It was good.

Carole House:
I thought it was a funny bit by him, because I use Venmo and other things. And so it was just funny to me that that was a bit he did, because the default setting being public, that was interesting. But blockchain as a technology is really interesting. And actually, I think that these two technologies help reinforce each other's best and worst traits in really interesting ways.

Carole House:
Because AI is in need of things like auditability, data integrity, and provenance tracking, and blockchain is made for that. The problem for crypto is that the guys who first deployed most of the use cases were deploying it as if it were private and privacy-enhancing. And I'm like, "That's not what blockchain was made for, to obscure things." Now, it can help provide things like the accountability-

Jamie Danker:
Accountability.

Carole House:
... angle of privacy, but that's not the way that Bitcoin and publicly publishing your transactions was implemented. So I think that blockchain could actually, if it's looking for a use case, and there have been many use cases explored, some of which successfully and some of which not so much, blockchain seems to answer some of the calls that AI is making, and blockchain's problems with things like scalability, AI can help solve. I think that these technologies have very interesting applications for each other. My problem is with blockchain and specifically decentralization: the most public, the most unaccountable, the issues that come up in DeFi from a lack of clear lines of accountability. That, combined with AI scaled without accountability, is the greatest fear that I have.

When you have a malicious cyber botnet that is being commanded and controlled by a distributed command-and-control node, one that's not operating on a couple of devices or servers but on 1,000 devices around the world, how do you take that down? In Japan, at Japan Fintech Week, there was an event that I did, and live during the panel I pulled up ChatGPT and I was like, "ChatGPT. Hey, buddy. If I deployed you on a decentralized network and then you started to act or operate maliciously, you got taken over, or your code had a vulnerability in it, or it just wasn't written well, because we don't write code very well, that's why we have so many security problems, could I take you down?" And its answer was, "It depends."

And I'm like, "Great answer." And it said, "It depends on how good my security is, how decentralized I am and how good your authorities are." I was like, "That's a great answer, ChatGPT. I totally agree with you." So I'll stop talking now. But basically it's this convergence that I'm like, okay, this is... Talking about worst case scenarios, this is something that I'm going to keep an eye on certainly.

Christine Owen:
Yeah, I agree. Deepfakes are probably the number one thing scaring me right now. And it's something that I feel like we need to re-educate security professionals on. Because right now we think, oh, I see you through a video and it's live, it's on Teams, whatever, then we're good. It's not. It's not trustworthy. We have to do checks on the backend, and that's something that we do at my company. We have to have those checks, and they're very important, to be able to detect liveness, true liveness, not just that there's somebody on the other side.

We've also found, in doing our research, that the price to get these deepfake overlays, to be able to defeat a person... It's generally not that you can defeat the AI, because we use AI on the backend to do the biometric checking. But to defeat a person, it doesn't cost that much money. It's actually really cheap. And if you are looking for a small investment, and I mean a small investment, to be able to make a lot of money, boy, that's the thing that you do. You go and buy one of these programs.

Jamie Danker:
Forget the stock market.

Christine Owen:
Oh yeah. I mean honestly-

Carole House:
I fear this is Jamie's launch into a life of crime.

Christine Owen:
Jamie is now-

Carole House:
15 years in civil service and it just goes downhill.

Christine Owen:
Hopefully we have a seven-year retention policy for this video.

Carole House:
No, it's going to be there forever.

Christine Owen:
Yeah, no, we don't have it public.

Christine Owen:
On the other side, this is actually, so I'm going to get a little personal here. I've actually, before I joined 1Kosmos, I was never into blockchain. Blockchain bros came to me like moths to a flame and I'd be like, "Yeah, that sounds great. No. Yeah, that sounds great."

Carole House:
I like you being the flame in this scenario, by the way.

Christine Owen:
It's the red hair.

Jamie Danker:
Never trust a ginger.

Christine Owen:
But at this point, now that I understand how private and permissioned blockchains work, I realize that the security measures for PII are actually so much stronger when you give the private key back to the end user than if you have, obviously, an open, public blockchain, which is not something that looks good. The encryption within blockchain is so strong, and it's so important, that if you find the right use cases and build it the correct way, it actually becomes a privacy-preserving mechanism.

Carole House:
Exactly. And I agree that it's those right use cases, and I love it being used in security things like immutable records.

Christine Owen:
Exactly, that's what you need.

Carole House:
... the supply chain. These are the right use cases. And it's not that you can't fix the public ledger issues, because you can, with other encryption and obfuscation, but you have to put in place other mechanisms, and often they haven't, or they haven't done so in a way that's lawful or that enables other types of security issues to be accounted for. So there are lots of broader issues, but I love leveraging blockchain for other really interesting enterprise use cases.
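The "immutable records" property being praised here comes from hash chaining: each entry commits to the hash of the entry before it, so editing history breaks the chain. A minimal sketch, with made-up audit records and no consensus, signatures, or distribution, which real deployments would add:

```python
import hashlib, json

def block_hash(contents):
    """Hash of a block's contents, which include the previous block's hash."""
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

def append(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"record": record, "prev": prev}
    block["hash"] = block_hash({"record": record, "prev": prev})
    chain.append(block)

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block["hash"] != block_hash({"record": block["record"], "prev": block["prev"]}):
            return False
        prev = block["hash"]
    return True

chain = []
for rec in ["audit: login ok", "audit: record created", "audit: record signed"]:
    append(chain, rec)
print(verify(chain))                          # True
chain[1]["record"] = "audit: record deleted"  # tamper with history
print(verify(chain))                          # False: tampered block fails its hash check
```

This is why the technology suits audit logs and supply-chain records: tampering is detectable, which is a different goal from hiding data, and it is why privacy needs separate mechanisms layered on top.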

Christine Owen:
Yeah. And the cool thing is that this is something that's actually gaining traction within the EU. They're starting to realize that for a strong digital wallet, blockchain might actually be the right technology. It's just how do you build that blockchain the right way to be able to make it privacy preserving, and also make it strong in security, which is really important. All right.

Oh, especially Jamie. Jamie's smiling, it's almost time to go. So the one thing I wanted to end on is that the four of us women share tea every other week, month, usually, not week. Oh my gosh, I haven't been in town to be able to do every... but we try to do tea every other week. And we also try to have additional events with a lot of women. We obviously have events when we can; Carole, for example, has a really wonderful happy hour that she has cultivated throughout the year, so it's so much fun. But what is it about having a group like this, where it's only women and cybersecurity professionals? Why is it important?

Carole House:
Yeah, so I'll kick it off-

Christine Owen:
Yes, kick it off.

Carole House:
... because you two are such an important part of that journey for me. And also I know Jamie who couldn't be here and just other really critical women in the identity world, like Honey over at NIST is amazing.

Christine Owen:
Absolutely.

Carole House:
And Naomi is amazing. And honestly, I'm a part of an executive women leaders fellowship that was sponsored by the women who stood up Fortune's Most Powerful Women list. And I loved their thesis behind why they launched it. This year, I think, was the first year since the early 2000s, maybe around 2005, that the percentage of women CEOs on the Fortune 500 list went down. So in almost two decades, it's the first year that it's actually decreased. And it's hovering around, I think I'm going to say 12%, but I know that it's in an article online. So that's after going up for the last two decades, and we're still hovering there. And as they were trying to figure out, what is this? Is this a pipeline problem? Do we need to cultivate and create more talent? Their thesis is that, no, it's not a pipeline problem. There are amazing women; they just don't have the championship and the access.

They don't have other people helping to build each other up, find each other's connections, figure out where those fusion points are, to go off and be given the opportunities that others are. And obviously there are many in disadvantaged populations that don't get opportunities. And I'm in the venture space, and the number of women in venture capital is extremely minimal, and for minority women especially, it's less than a percent. It's wild. So I've been honored to be a part of this journey and the fellowship, and I'm thrilled to be with the most incredible community of women there. But in cyber, a lot of the female leaders that I've latched onto were just the most kind and open with information, with connections, with contacts, in a way that helped me grow, whether I was getting deeper in a specific area that I already cared about or touching something else. And then through people like you... I know I threatened for years to get both of you into blockchain and crypto and-

Christine Owen:
And then I did it.

Carole House:
Exactly. And we're getting Jamie.

Christine Owen:
Eventually she'll get there,

Carole House:
Because ultimately, what I've seen with ladies like you is a space that was in desperate need of your expertise. Where I was like, "Oh my gosh, none of them understand how to talk about privacy versus anonymity." And, oh, all these things need privacy frameworks, and all these guys need to enhance their security, because, oh my gosh, they're getting robbed by Lazarus Group. If Lazarus Group changes from targeting banks and SWIFT terminals to go after your sector, because it's so much easier to target and monetize, you have some soul-searching to do as a sector. And basically, when I see you guys, I see the answers to that. And it's not because you're inherently women, it's because you're the leading experts in this space that I can point to, as well as many others that I try to bring in also.

But frankly, we see that women need the championship and the access, oftentimes more, because there are rooms where I see that people just won't say the name of the woman who should be talked about, who should have been in that room. Just say her name. And it often isn't said. I see that, and it's something that I've striven for, because I've been lucky to have specific champions in my life who taught me that lesson, and I've learned from them. And I need to pay that forward and search for it. But I think understanding that amongst each other, because we understand those challenges, and then also pushing ourselves and those within our network to say, "This is your moment. I'm looking for a champion. Are you going to be that person?" That's not something that we're always raised to be supportive of, inherently.

We're often mentors. Being a sponsor and a champion is different: it's not just helping someone grow, it's me saying, "No, I'm taking you with me into that room." And sometimes that rocks boats, in a way that we're often told not to: "Don't rock the boat, because you're competent enough, you're good enough, you're going to be brought along for this ride." But that's not enough. It's not enough anymore. We need to be willing to rock that boat and drag the other person with us into those rocky waters. I don't know, I feel like I've failed on a bunch of metaphors there. But anyway, that's my feeling about it, yeah.

Christine Owen:
No, I mean, you mixed a lot, but it worked well.

Jamie Danker:
I feel like I can't follow.

Carole House:
No, you can.

Christine Owen:
I wanted to add something and I think that it'll make you think of some really good puns.

Carole House:
Yeah, oh, yes.

Christine Owen:
After RSA, there was this really interesting post by a man, and one of the things he said was that he recognized there was a need for more women in cybersecurity. And then someone responded to that post and said, "Actually, I felt like there were a ton of women there." And then I responded, as a woman, and said, "Actually, no. I even went to a women's event at RSA, and it was half men." And I'm not saying that there shouldn't have been half men in the room, because I'm not trying to exclude men. But I think there are instances where it's more important to have women together to bond, to be able to speak openly, especially when they're at the same career level and working towards the next level together.

So Jamie and I have a couple of different groups like this where they're small groups of women where we're all around the same level and we're all moving towards a higher level together. And it's really fun, because we get to celebrate the wins and we get to help out whenever things don't go the way that we thought they would. And the interesting thing is that unlike Carole, all of my mentors have been men and continue to be that, which is why I thought it was really important for me personally to have a group of women who are in a similar situation so that we can have conversations. So does that help you think of things?

Jamie Danker:
Yes. I mean, I am so thankful for having found this network. I feel like there were folks who kind of brought me in, and I was like, "I'm invited?" I got invited to some cyber dinner at the NSA, and I was like, "Oh my God, these women are so amazing." And here I was with a different perspective; I felt honored to be there. And there was this feeling of being at peace there, because there were women, mostly mid to senior level, some moms, some not moms. And that's what I love about being around women, too. I appreciate the life paths that everybody has taken. And I appreciate getting those perspectives, because I feel like it makes you a better professional to recognize other people's circumstances. So I love it. I have a soft spot in my heart for young women.

And I've mentored; when I was a team lead, I mentored folks. Nothing brings me greater joy than to see them succeed. Maybe that's a little bit of the mom in me. But I like having these great discussions with the 20-somethings. They're hilarious. There are a lot of things about Gen Z that definitely drive me batty, because my 13-year-old Gen Z-er definitely drives me batty, but-

Christine Owen:
I don't know what half the things they say mean.

Jamie Danker:
Right. They joke, but it's like... But it's important to understand their generation and perspective too, because you realize that we need privacy professionals, we need cybersecurity professionals. And I feel like women are very observant, they notice things, and I think that attention to detail is really important in these disciplines. But no, I feel like the network of women, whether sitting right here or in other groups, has been invaluable.

Christine Owen:
Yep. I totally agree. I think the interesting thing is that I'm actually thinking back to RSA. I was in a lot of meetings and there was only one meeting where there were more women than men in that room or in that meeting. And by the way, for the credit of the federal government, it was the U.S. Federal government. And I will say that the government has done a really good job to advance women. And luckily all three of us have been able to work with government, so we have been in those situations. But today, generally, I think all of us, I might be wrong, but I think all of us are generally one or two women in a room full of men, right?

Carole House:
Yeah. In fact, that's so funny, because it reminds me of my first meeting at OMB. My only experience in government had been in uniform, where I was in the operations shop, and so I was often the only woman in the room in that unit at times. But it never felt... There's this bonding that you get in the military, though there were certainly challenges presented by that with certain colleagues. But then came my first role on the non-uniformed side, and a meeting with a major, major company. And I almost didn't go to that meeting, because there was a side meeting happening, and I almost got up and left. But one of the senior male staffers grabbed me and was like, "Nope, sit. You're staying here." And even though I was the most junior person on that trip, he was like, "Stay here."

Because as we walked in, he noted, earlier than I did, that there was not a single woman around the table. There were 20 executives from that company, and the only women in the room were at a different table, the ones who offered and brought us coffee. And they were, of course, wonderful and kind and great, but that was a problem. And he saw it, and he was like, "Nope, the government's going to be the one to show that there's a woman here. They don't need to know that you're the most junior person in this meeting. You need to stay and show this." And I wasn't the one who noted it; he was like, "We're going to send this message." That was really impactful to me, because I hadn't been looking through that lens, and it was really eye-opening.

Christine Owen:
Yeah, no, I think that's great. All right. Well, I am almost done.

Christine Owen:
Carole's about to be done. So thank you so much, ladies, for joining me. All right. Have a great night.