Eve Maler

Wordcount: 6540
Average reading time: 33 minutes
Date published: 3/15/2019
Follow Eve on Twitter: @xmlgrrl
Website: https://www.forgerock.com

Steve: Well, hello Eve. It’s awesome to have you on the Nonconformist Innovation Podcast today. There’s so much we can talk about, but since this is the Nonconformist Innovation Podcast, I don’t want to let our listeners down. Let’s use this time to explore innovation practices, bringing emerging technologies to market, and then putting innovation into practice: how privacy-enabling technologies can be used to drive business outcomes. I think a lot of us know your current job title has innovation and emerging technology in it, so I think you’re the right person to have on the show today. I have confidence it’s going to be a productive conversation, if not slightly entertaining.

A little bit about your background for our listeners: you’ve pointed out in your online bio that one of the best ways to get to know you is through your nicknames. A couple of the ones that I’m aware of are XMLgrrl, SAML lady, and UMAnitarian. These are apropos considering your non-trivial role in the creation of XML, SAML (the federated identity standard), and more recently UMA (a protocol for access management with privacy and consent built into the design). Your online bio informs us that you’re a renowned strategist, innovator, and communicator on digital identity, authorization, security, privacy, and a long list of other things.

As V.P. of Innovation and Emerging Technology at ForgeRock, a leading identity and access management vendor, you drive privacy and consent innovation for its identity platform. You’ve guided ForgeRock’s implementation of UMA and related standards such as Health Relationship Trust, or HEART, which I hope we’ll briefly discuss later in the show. You’ve acted as an advisor to the U.S. Health and Human Services API task force. Before that, you spent some time as a Principal Analyst at Forrester Research, and you were a Distinguished Engineer at PayPal and a Director and Standards Architect at Sun Microsystems, among other roles. Eve, it should go without saying that our industry is very fortunate to have you and your genius in our midst, and I’m lucky to have you on the podcast. Thank you and welcome to the Nonconformist Innovation Podcast. I really appreciate your time here today.

Eve: Well it’s a pleasure to be with you, of course. I am lucky to count you as a friend.  I think that your new podcast series is a brilliant idea.

Steve: Thank you, Eve. So, we could easily start off by talking about the genesis of XML and SAML and your role in the development of those technologies, but those are nearly household names today, like MTV and passwords. At least, I think they are within our industry.

Let me start where I first met you and learned about UMA: in 2011, at one of the Internet Identity Workshops in Mountain View, where all great identity-related technologies go to incubate. I remember being at one of the IIW workshops where you gave an introduction, probably one of the first ones (correct me if I’m wrong). You presented an overview and a live demo of an early UMA use case, and there are a few interesting things here. You founded UMA and have been the driving force behind the vision, development, and subsequent adoption of this emerging technology. It’s taken a while for it to catch on, at least from what I can tell. Can you share with the listeners a bit about how you got this idea to see the light of day up until now?

Eve: Absolutely. First of all, there are actually a lot of “UMAnitarians” (that’s one of my nicknames: “Chief UMAnitarian”). There are a lot of us who have contributed. And actually, at the time, several people and organizations had come up with similar visions, as often happens. That included a project in the EU called the PRIME project, and the Vice-Chair of our work group for UMA, Maciej Machulak, who’s now at HSBC, was one of the folks who came up with a protocol to do something similar. Many of us found each other, and here’s how we started to work on operationalizing this specific vision.

We decided to build on top of OAuth; at the time it was OAuth 1.0. A little joke that you sometimes hear in standards is: how do you know that you’ve got version 1 of a standard? The answer is: it doesn’t have a version number on it. It was just OAuth at the time. We learned from that and put 1.0 on UMA.

Connect your disruption to some familiar ecosystem so that you can basically have a connection path to the past.

UMA tried to achieve several disruptive things at once. It was trying to empower people: there’s a user in User-Managed Access, and that’s the resource owner of OAuth. If you’re trying to do that, you need to build on some known pieces. We had a series of design principles, and one of them really was saying, look, we’re building on something that’s known, which was OAuth at the time. We also decided to be identity-system agnostic. We didn’t want to reinvent that wheel. Just like OAuth, we were about authorization, about delegation, but we didn’t want to have to reinvent any of that, and we wanted to absorb whatever identity system won out. OpenID Connect didn’t exist at the time. There was OpenID, there were a lot of people experimenting, you still had people looking at InfoCard, and there was a lot of SAML usage, as there still is now. We basically said, look, we’re going to build some explicit ways to absorb whatever is being used underneath for identity. Maybe a lesson there is: connect your disruption to some familiar ecosystem so that you can basically have a connection path to the past. I actually learned that from my XML experience.

Steve: That’s great! All good technologies are built on top of one another; the Internet, as you know, has many different layers. I’ve heard some talk about how OAuth and UMA might be a little bit bloated, but when you talk about authorization we’re not necessarily talking about milliseconds, are we? I think it’s more important to get authorization right with scopes and prevent unauthorized access. Is that a little bit of the thought process here?

Eve: Well, yeah, you have to get it right, but also, I think everybody is sensitive to bloat. In fact, UMA 1.0 was, I think you could say, more inspired by OAuth than built right on top of OAuth. We actually learned our lesson. We did an UMA 2.0 (so there goes that version number) that was a much thinner shim right on top of OAuth 2.0 by the time we were done with things. In fact, the UMA group just contributed the UMA specs to the IETF. UMA is now a formal extension grant of OAuth, so it really is a much thinner shim, and there’s another aspect as well that’s sort of optional. Those were contributed to the IETF, and we’re working with folks in the OAuth group to say, hey, how can this just be absorbed into the IETF as a grant spec?
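
To make that “thinner shim” concrete, here is a minimal sketch of the UMA 2.0 extension grant from a client’s point of view: the client exchanges a permission ticket (plus pushed requesting-party claims) for a requesting party token (RPT) at the authorization server’s ordinary OAuth token endpoint. The endpoint URL and client credentials below are placeholder assumptions, not any particular product’s values.

```python
import requests

# Hypothetical endpoint; a real UMA 2.0 authorization server publishes its
# token endpoint in a discovery document (e.g., /.well-known/uma2-configuration).
TOKEN_ENDPOINT = "https://as.example.com/oauth2/token"

def request_rpt(permission_ticket: str, id_token: str) -> str:
    """Exchange an UMA permission ticket (plus pushed requesting-party
    claims) for a requesting party token (RPT) via the UMA 2.0 grant."""
    resp = requests.post(
        TOKEN_ENDPOINT,
        data={
            # The extension grant type UMA 2.0 registers with OAuth:
            "grant_type": "urn:ietf:params:oauth:grant-type:uma-ticket",
            "ticket": permission_ticket,  # obtained from the resource server's 401 response
            "claim_token": id_token,      # e.g., an OpenID Connect ID token pushed as claims
            "claim_token_format": "http://openid.net/specs/openid-connect-core-1_0.html#IDToken",
        },
        auth=("my-client-id", "my-client-secret"),  # placeholder client credentials
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]  # the RPT the client presents back to the resource server
```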

Steve: You make it sound so easy.

Eve: Well, it’s not. No standards work is really easy. It’s a matter of working and socializing, and taking into account that there’s a whole bunch of implementers of UMA 2.0 now, so you have existing implementation experience, along with the usual security considerations, privacy considerations, things that you learn from security analysis, and market inputs. So yes, it takes work.

Steve: Let’s talk about that for a little bit. As you said, nothing is easy, but you have this track record of being involved in bringing new technologies to market. A key aspect of nonconformist innovation, at least the way I see it, is the mindset that compliance is not enough and more can be done to open new markets, create new products for existing ones, and generally unlock new value that can differentiate your business. I look at SAML, XML, and UMA as such innovations. I don’t think there were any regulatory bodies or governments twisting your arm to do this kind of work. So, on a personal and maybe professional level, how do you justify enduring the hardship and the challenges of working on these standards that may or may not ever amount to anything? I mean, a mortal would never wish this level of uncertainty on herself or her fellow humans.

Standards are just kind of strategically commoditizing something that’s about to be commoditized so that you can build value on top.

Eve: Well, as I say, fools go where angels fear to tread. When I started working on XML, I feared it might be a career-limiting move, because a bunch of us launched that work as SGMLers and it maybe didn’t look like such a good idea to the company that I was in, although it turned out to be one; I think it contributed to a great exit for them. I’ve managed to make that a repeatable success by presenting the great opportunity that it is if you do it right. It is a kind of disruptive innovation where you present an opportunity to expand the market dramatically and then be well positioned to grab a really big slice of that massively expanded market. That’s a neat trick if you can pull it off, and it’s much more fun than grabbing a bigger share of a shrinking market, right? And then there’s how you create that standard. After all, standards are just kind of strategically commoditizing something that’s about to be commoditized so that you can build value on top.

XML was an interesting example because it was actually just a smaller profile of its ancestor, SGML. We were just chipping away all the parts of the marble that didn’t look like XML. And then SAML was interesting because that was at the dawn of Web services. SOAP had just come out and everybody was scratching their heads wondering what to do with it. So there was this new thing called Web services, and it needed security, and SAML was a cross-domain security trick. Notice it didn’t have the I-word in it; we weren’t talking about identity then. SAML is the Security Assertion Markup Language, and its cross-domain security trick was cross-domain single sign-on. It turned out to hit that point of both disruption and connecting to existing ecosystems sufficiently that it took off, which was not guaranteed, as you say, at all. And then looking at UMA: it’s inspired by what Google Docs kind of does with its share button, which is kind of like super-duper consent, only it’s for building trusted user relationships and not just providing compliance. So, to your point about going beyond compliance, looking back over the last several years, market forces and consumer discontent have driven a need for GDPR beyond the original Data Protection Directive in the EU, and so on. Those have now, in the modern era, strengthened UMA’s momentum. There have been many forces playing against each other. What I kind of figure is you basically must enjoy living in the future and being a little bit frustrated all the time to have that vision to go once more into the breach.

Steve: That’s right. You mentioned strategic standardization. I like that; at the end of the day it all has to come back to a revenue driver or providing something of value, whether it’s faster or safer for the consumer, in order for things to really take off and get interesting. So, looking back a couple of years: you joined ForgeRock in 2014 as their VP of Innovation and Emerging Technology, as we already hit upon. By the way, I think whoever hired you into this role is a genius, considering that ForgeRock was only founded four years prior. For a company at this stage to commit to an executive role in innovation and emerging technologies seems forward thinking. So, bringing you in at this stage to drive privacy and consent, pre-CIAM and pre-IoT, was pure genius, and with UMA kind of waiting in the wings as you came in. Was this the big break that you needed to help UMA become mainstream? In less than a year of your joining ForgeRock, UMA 1.0 was ratified and launched.

Eve: Well, ForgeRock really was prescient in that, and you’ve got to pin all that on one of ForgeRock’s five founders, its CTO Lasse Andersen, and our then-CEO Mike Ellis. ForgeRock has always deeply grokked consumer scale. It’s in its bones, and its architecture helped it grok IoT, and also consent management and consumer trust, at an architectural level. ForgeRock launched identity relationship management with the Kantara Initiative, which is where UMA is done, and yes, all of that helped accelerate UMA’s timeline, including the implementation that we did. Since then we came out with things like a profile and privacy management dashboard: a single-pane-of-glass view of all the permissions that an end user can manage, including things that you think of as compliance-oriented but going way beyond that.

Steve: I’ve been asked within the last couple of weeks: how do you make sense of so many vendors in the identity and access management marketplace? It’s a $7 billion market, depending how you slice it, and growing. One of the things that always sticks out to me about ForgeRock is seeing around corners and being able to, like you said, build some of these privacy-enabling capabilities right into the architecture. Now your company is positioned extremely well because of that, in light of GDPR, and California is soon going to be coming out with some of its own regulations in this area. It’s interesting. We’re living through a time in history where our entire lives are going online, whether we like it or not, and consequently our privacy erodes when data protection and consent implications are not part of the original design. Just last week we heard about a data breach of customer records at UW Medicine at the University of Washington affecting more than 900,000 individuals. That statement “We take your privacy seriously” is almost becoming part of an ongoing joke in the industry. Although ForgeRock has some early adopters of UMA, are recent breaches and privacy awareness creating the perfect storm for CISOs and stakeholders of health care organizations to take notice and action on UMA and their privacy-driven initiatives?

Privacy is not just encryption; UMAnitarians like to say that privacy is context, control, choice, and respect.

Eve: Yeah. I think breaches are part of the story, and they’re an important part. As I was alluding to earlier, there’s an intertwining of consumer sensitivity, and kind of cynicism and savviness, with the regulatory forces. Another part of the story, beyond things like breaches, is the recent stories in the news about Google Nest products having cameras that you didn’t know about, and airplane seat backs having cameras that you didn’t know about. The way that I think about this: some of my standards colleagues and I gave a talk about Privacy 2.0 at last year’s Identiverse, where we presented a model of what’s important about data privacy as three layers of a pyramid. At the base of that pyramid is data protection, as in GDPR, the General Data Protection Regulation. Data protection, just as a sheer English phrase, is about the security of personal data. That’s the stuff about not accidentally letting the data out, and everybody really has to do that. You’d hope it’s table stakes. Obviously, it’s hard to do, and there’s low maturity around it generally in any industry, but privacy has to be more than that. There’s actually a saying UMAnitarians have about what privacy is and isn’t. Classically, it’s easy to say that privacy is not secrecy. Privacy is not just encryption, and UMAnitarians like to say that privacy is context, control, choice, and respect. It’s much more than technology, and you can’t keep everything secret. So how do you do that if data protection can’t be the whole equation? The second layer of the pyramid is data transparency: GDPR now requires that you tell people what you know about them and what you’re going to do with what you’ve gotten from them. That’s still not the whole story, because they can’t do anything about it once you’ve told them; presumably you just tell them. The third layer is data control: what do you let them do about it?

We’ve learned that consent now has to be more meaningful. GDPR, for example, will say we can’t just give them opt-out. You can’t just show up and have everything pre-checked, because that’s not fair. But the hope is that you can do more. UMA is an example of an architecture that lets you do more, like the totally proactive control a Google Docs share button gives you over what you’re sharing. The trick is that the top two layers of that pyramid are about business model. They’re not about what you’re hoping to avoid doing accidentally.
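
As a hedged sketch of that share-button-style, proactive control in UMA terms: under UMA’s federated authorization model, the resource server registers the owner’s resource at the authorization server, where the owner can then attach sharing policy to it. The endpoint path, resource name, and scopes below are hypothetical.

```python
import requests

# Hypothetical resource registration endpoint at the authorization server;
# real deployments advertise it in the AS's UMA discovery document.
RESOURCE_REG_ENDPOINT = "https://as.example.com/uma/resource_set"

def register_shareable_resource(pat: str) -> str:
    """Register a resource so its owner can later permission it centrally,
    much like flipping on a document's share button."""
    resp = requests.post(
        RESOURCE_REG_ENDPOINT,
        json={
            "name": "Sleep data, March 2019",         # illustrative resource
            "resource_scopes": ["view", "download"],  # actions the owner can grant
        },
        headers={"Authorization": f"Bearer {pat}"},   # protection API token (PAT)
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["_id"]  # the AS-assigned resource id that policy attaches to
```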

Steve: You’re absolutely right about that.

Eve: Going back to those cameras: somebody chose to put those in there, and Facebook SDKs chose to let third parties get certain data.

Steve: I just came across a great statement; although it’s not new, it was new to me. One of the laws of software is that software is a reflection of the hierarchy and organizational chart of the company that created it.

Eve: Yes, it’s Conway’s law. It’s mind-blowing when you think about it; Conway’s law is from 1967. Your communication structures dictate what kind of software you write.

Steve: Consumer-facing enterprises choosing technology might be well served to think about that in the vendors they choose and cobble together, and to realize that in coming up with a solution, technology alone isn’t enough. We need to think a little more meta than technology: what is our mindset in connecting these systems together that demonstrates that trust is our driving principle and Privacy by Design is our number one principle when building an identity system, consent management, or CRM, and in how we manage all of the complexities of these systems and data on behalf of the customer?

Eve: I think you’re absolutely right. And I’m so glad that you brought up this law in this context, because it impacts how companies think about privacy and their fiduciary duty to customers, and to consumers who aren’t their paying customers, which is what a lot of these consumers are. I’m going to have to think more about that. Oh man. Mind blown.

Steve: But it’s more than just what appears on the surface, right? Companies grow inorganically by acquiring other companies, and then they inherit that technology and that risk, which is not an IT problem, right? That’s a business problem. The business sponsors need to think about the risks, de-risking their M&A strategy, for example, to make sure that it’s consistent and aligned with the level of risk that they want to tolerate.

Eve: Yes, and there’s a connection to privacy regulations when you look at the language there about data controllers, say the digital service that directly interacts with a consumer or a data subject, and then data processors who are part of, if you will, the data supply chain, a degree of separation away from that data subject. But in the data supply chain it gets all mixed up when you talk about API interactions with services and so on. Not a lot of good thinking has yet gone into API interactions, for example, either within a company or across domains, and how they affect data controller versus data processor versus third-party relationships. I’m getting into privacy regulatory terms of art here. But how that interacts with the fiduciary responsibilities of companies that have to be compliant, and also how it interacts with Conway’s law, I think is important to get right, and I think not enough thinking has gone into that. I keep trying to do some of that thinking, and I do it in the standards world some, and I collect lawyers, as I often tell people jokingly, but it’s true, I collect a lot of lawyers. It’s very handy.

Steve: But you know, we’ve always known at an anecdotal level that the left hand and the right hand should somehow coordinate. Even within the brain, our bodies and our nervous systems are coordinated. And within business and technology, it just seems like more coordination is needed.

Going a little bit further along the lines of the work that you and ForgeRock have been doing in the health care industry: just last week we had this news of the data breach at UW Medicine, and you and ForgeRock have been involved in the sector. We also know that the Office of the National Coordinator for Health IT recently came up with some new rules for data sharing and privacy standards. Ideally, the only person who should have to take action to approve an app’s access to patient data, or electronic health information, is the patient or their representative, and now UMA is going to have an interesting role in enforcing this consent model and in satisfying the requirement that reasonable security safeguards are in place in health care IT systems.

What do you see as the implications for health care organizations? What should savvy health care companies be thinking about in their planning and strategies now to ensure improved privacy and consent will exist in the future?

What companies should be thinking about, given the new rules, is robust consent: unifying privacy and consent architectures with authorization architectures so that you can properly externalize your authorization logic and keep it clean.

Eve: What companies should be thinking about, given the new rules, is robust consent, actually: being able to unify privacy and consent architectures with authorization architectures so that you can properly externalize your authorization logic and keep it clean. That way you know what somebody has permissioned access to. One of the things UMA tries to do, the way it’s designed, is to bring a model of web access management (WAM) to individual people, versus just an enterprise; that’s in fact what its architecture looks like.

Interestingly, the healthcare world innovated around a consent concept called Consent Directives. Probably a lot of us have filled out these paper Consent Directives if we’ve gone in for surgery or something like that. These are the forms you fill out to say what you want to have happen, not just to yourself but also to your data (your health data), if something happens to you. I don’t think the folks in this field see this as earth-shattering, but I actually do, because if you were to digitize what those paper forms do, they’d look like authorization policy. They’re proactively made policy about where data should go, what it should be, and who it should be given to.
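
To illustrate that observation, here is what a digitized Consent Directive might look like as data. The field names are invented for illustration and don’t follow any particular healthcare schema (FHIR, for example, defines its own Consent resource).

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ConsentDirective:
    """A paper consent directive, digitized: effectively authorization policy
    written in advance by the patient (the resource owner)."""
    patient_id: str
    grantee: str                      # who may receive the data
    data_categories: List[str] = field(default_factory=list)  # what data is covered
    allowed_actions: List[str] = field(default_factory=list)  # how it may be used
    purpose: str = "treatment"
    expires: Optional[str] = None     # ISO 8601 date, or None for open-ended

# Example: "share my lab results with my specialist, read-only, for treatment"
directive = ConsentDirective(
    patient_id="patient-123",
    grantee="practitioner-456",
    data_categories=["lab-results"],
    allowed_actions=["read"],
    purpose="treatment",
    expires="2020-12-31",
)
```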

Now there are some programs, like one in the government called Consent to Share, involving UMA and HEART profiles. You mentioned HEART (Health Relationship Trust); these are profiles of OAuth, OpenID Connect, and UMA. HEART is a working group at the OpenID Foundation for consumer-controlled, or patient-directed, health data sharing. I would say that this notion of robust consent of that sort, patient-directed and consumer-controlled, is the right place to start to avoid information blocking structurally, and that’s what all these new rules are about. There are two sets of new rules, and that’s what they’re trying to prevent. There’s a whole bunch of nuance to the rules, and there’s some very good information online about them for anybody who hasn’t gone to look at them yet, including a lot of infographics, but the mindset is to try to avoid information blocking from the get-go. So robust consent is the way to think about it.
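
For a feel of the API side, here is a minimal sketch of an app reading a patient’s data over the FHIR REST API with an OAuth bearer token, the kind of call a HEART-profiled deployment would authorize with patient-directed scopes such as patient/Observation.read. The base URL and token are placeholders.

```python
import requests

FHIR_BASE = "https://fhir.example-hospital.com/r4"  # placeholder FHIR server

def fetch_observations(patient_id: str, access_token: str) -> dict:
    """Read a patient's Observation resources, authorized by an OAuth token
    whose scopes the patient (or their representative) consented to."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id},  # standard FHIR search parameter
        headers={
            "Authorization": f"Bearer {access_token}",
            "Accept": "application/fhir+json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # a FHIR Bundle of Observation resources
```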

Steve: You mentioned externalized authorization. Because you can scale these unique profiles by sector, like HEART for health care and so forth, is there also an element of compliance proof there? You may enforce policy based on business direction, but for regulators you also have to say, hey, here’s how we’re doing it, and sometimes even audit it as well?

Eve: Yes, funny you should mention that. It’s about sharing the data, but it’s also about proving that you un-shared it, and it’s much easier to do that if you’re not building the authorization logic app by app but have literally externalized it so you can centralize it. It’s for the same reason we have the PDP (policy decision point) and PEP (policy enforcement point) concept, which is roughly what UMA does as well. When I said WAM, I really meant WAM. It’s the same concept, only you’re directed by a patient, or a health insurance subscriber, to share or un-share data. You have the same motive, including auditable proof that you did it or undid it, that you do when it’s enterprise business rules that turn into authorization.
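
Here is a minimal sketch of that externalization, with invented names throughout: the policy enforcement point embedded in an app holds no rules of its own and simply asks a central policy decision point, so sharing and un-sharing change policy in one place and every decision leaves an auditable trail.

```python
import logging

logging.basicConfig(level=logging.INFO)
AUDIT = logging.getLogger("authz-audit")

# Centrally managed policy store (hypothetical): who may do what to whose data.
POLICIES = {
    ("patient-123", "Observation", "read"): {"practitioner-456"},
}

def pdp_decide(subject: str, owner: str, resource: str, action: str) -> bool:
    """Policy decision point: evaluates central policy and logs every decision,
    giving auditable proof of both sharing and un-sharing."""
    allowed = subject in POLICIES.get((owner, resource, action), set())
    AUDIT.info("owner=%s subject=%s resource=%s action=%s allowed=%s",
               owner, subject, resource, action, allowed)
    return allowed

def pep_enforce(subject: str, owner: str, resource: str, action: str) -> None:
    """Policy enforcement point: no app-local rules, just asks the PDP."""
    if not pdp_decide(subject, owner, resource, action):
        raise PermissionError(f"{subject} may not {action} {owner}'s {resource}")

# The patient "un-shares" by removing the grant centrally; every app's PEP
# immediately reflects the change.
pep_enforce("practitioner-456", "patient-123", "Observation", "read")
POLICIES[("patient-123", "Observation", "read")].discard("practitioner-456")
```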

Steve:  So, you’re saying it’s okay to just hit the delete button?

We know that IAM is tough in the best circumstances for enterprises. Imagine when you have to scale it to consumers; you want similar logic to apply at consumer scale.

Eve: I mean, it’s not okay. I remember talking to some folks representing hospitals some years ago, where something like 63 percent of all the access control was by exception. In my analyst days, it was just terrifying how bad access management was at the enterprise level. Now imagine you have to scale it to your patients or your subscribers or their payer or provider, speaking in U.S. terms. It’s bad enough when you’re looking at a hospital institution or at enterprise access management; it’s done so poorly. We know that IAM is tough in the best circumstances for enterprises. Imagine when you have to scale it to consumers; you want similar logic to apply at consumer scale.

Steve: That’s right. It’s just not getting any easier, either. You talk about web access management and legacy technology, but within healthcare you now have physicians carrying their mobile devices and patients with health trackers. We live in a world where consumer wearables and fitness trackers can collect all this sensitive health-related information, and it can be used by employers to assess the health and potential risk factors of their employees, among other things. This raises some pretty considerable privacy concerns. Is it problematic that employers can now require high-risk employees to pay more in insurance premiums, or worse, punish them for not maintaining their health? We’re essentially providing the ability to track employees’ movements to their managers if it’s not managed correctly. We have a word for that: it’s called stalking.

Eve: Seriously. Oh my goodness, yes. A few years ago I was doing the “TMI, I’m giving a privacy talk” thing, telling everybody about a product I had just bought, on stage at RSA, and it was one of those adjustable beds that came with the capability to track your sleep. So here I am telling everybody, you know, the story all over again. I complained on Twitter about the fact that if you looked at the terms and conditions for the smart sleeping app connected to the bed, it was terrifying! They’re like, “We own your biometric data,” and there is nothing you can do about it. Oh, and by the way, “we can detect if somebody else sleeps in your bed,” and you can’t turn off this feature. I was talking to a friend of mine who used to be a Chief Privacy Officer, and she said, “I smell divorce cases.” Come to find out, if you look at version 3 of the app, they’ve changed their terms of service and actually added functionality where you can turn off the tracking. At least they tell me you can turn off the tracking; for all I know they’re still tracking it. These days you can’t trust the companies. So you hope that employers won’t be incentivized to do this, but their incentives are such, with the way health care costs run, that this is what they could be doing.

Steve: I have heard other similar stories with iPhones or the Apple Watch. Apps that track your movement mean you can also be tracked by a spouse, at a pretty granular level. So, not to get too salacious here, but you can now see whether a spouse is in their hotel room, and tracking the movement between hotel rooms at night could be pretty damaging to relationships, I suppose. Not that that’s not already going on; it’s just that now, with technology, the cat is out of the bag, so to speak.

Eve: It is. This is where, as we were discussing, there’s the business layer versus the technology layer. There’s a term used in some of the standards work I do: the BLT sandwich, for Business, Legal, Technical. We were talking about data control and data transparency being business model concerns. To a certain degree, businesses are offering digital services that are valuable to people, and IoT’s value proposition is the data. You could buy dumb devices, but a smart device’s value proposition is largely the data. So there’s a certain degree to which businesses are going to have to clean up their act. It’s best if they self-regulate ahead of regulation if they want to have some choice in the matter. You can see companies jockeying for position around the barriers they put in front of what they can collect; look at Facebook Portal having a physical barrier to the camera and things like that. They’re going to have to prove themselves at some point.

Steve: And some of these devices weren’t built with Privacy by Design; they weren’t designed to be used in compliance-driven environments. My daughter uses this app called TikTok, and they’ve gotten into a little bit of hot water of late. Yesterday on the drive home she was complaining, asking me if I had deleted her app, because she couldn’t access any videos or her profile or anything. Well, there were some violations of COPPA, and they had to delete all of the profiles of their users under 13 because they didn’t have the consent of a guardian or parent. In these unregulated environments, things get suspicious and dangerous very quickly, and within regulated environments things don’t automatically fix themselves. You made an interesting comment about self-regulating, so there are these challenges; let’s talk about those for a minute. On the other hand, the ability for wearables and sensors to provide lifesaving insights and crucial data in real time, and consequently to automate appropriate care, is profound. It seems to me that data sharing and consent are a critical factor, if not the most important factor, in enabling health care companies to foster consumer and patient trust in this new world of devices and sensors while improving wellness and quality of care. That seems like a no-brainer to me, if (and that’s a big if) we can get constrained delegation to apps and consent issues worked out, or built in, as it were. What do you see as the other key challenges to adoption of devices, privacy-enabling technologies, and these consent models where UMA can play a part?

Eve: I have some insight into this, partly because of my work with a joint venture called the OpenMedReady Alliance with some folks who are very familiar with the health care business. What we’ve been trying to do there is put together the glue specifications that might be needed among things like the FIDO specs, Bluetooth, OAuth, UMA, and HEART, spanning from software to hardware and the whole IoT picture. What I’ve learned there is that there are a number of challenges, largely having to do with how you solve the device provenance picture, or “provenance,” as the health care people say. There are ways to solve that problem, but you do have to sew together quite a lot of standards to make it happen. A lot of pieces are actually in the picture, because the FHIR API exists; that’s Fast Healthcare Interoperability Resources, the open API in the health care picture. We’ve got a lot of authentication standards that have already been lined up, we’ve got a lot of identity standards, and we’ve got a lot of security standards from the API world (the OAuth stack, including UMA), and really, just profiling them together would solve a big part of the problem. So I think we’re perhaps on the cusp, in the next couple of years, of some breakthroughs, and what will help is some more maturity on the EHR side of things. Maybe these rules will help, because the rules around information blocking, I think, are trying to dissolve some of the reluctance when it comes to business models on the EHR side of the equation.

Steve: So that sounds to me like the challenge of having to cobble things together, kind of like the IKEA model, right? You can order all of the pieces, but putting them all together is the craftsman’s role, right?

Eve: Yeah, yeah. It’s sort of “friendly and efficient self-service,” as my dad used to say. So yeah, do it yourself. It shouldn’t take too long; you just need that one Allen wrench that rolled under the couch.

So that’s what OpenMedReady has actually been doing: trying to solve some of those challenges and put together some demonstrations of what it can look like. Maybe that will help. I hope it will.

Steve:  I hope so too. As we wrap up, what conversations should CISOs be having with business stakeholders and steering committees right now in this regard?

You have to figure out the intersection of your digital transformation goals and your user trust risks or gaps.

Eve: You’re absolutely right that it’s a multi-stakeholder challenge within every enterprise, and there are four broad steps that I’ve figured out, in talking to a lot of customers, to recommend for building trust and going beyond compliance. The first step is that you have to figure out the intersection of your digital transformation goals and your user trust risks or gaps. There are companies who have gone to market with some cool consumer IoT solution but forgotten to ask that question, which is very expensive. Business owners are really gung-ho to do something, and then a risk lead comes in and says “Oh, but…” So you’ve got to have that conversation.

Number two, you need to start conceiving, as an enterprise, of personal data as a joint asset. Now, a risk lead will say, quite comfortably, “Oh look, GDPR says this is really owned by the data subject,” but that’s a little bit facile from the perspective of human nature. Business owners will say they “need the info” and “really want the info,” and that’s why I suggest joint asset. Sort of come together and empathize with your consumer, because everybody is a human first.

Third, lean into consent as a choice. It’s not easy to make the choice to ask your end user for consent, and there are times and places not to ask, thinking of GDPR specifically. Now there are more regulations on the horizon and right in front of us, with things like the California CCPA.

But try and give the consent choice to people.

Identity and access management, and the things that these protocols are here to do, are there to make your life easier. They tend to be well vetted by a lot of experts worldwide and are there to accelerate your time to success.

And then fourth, take advantage of consumer identity and access management for actually building trusted digital relationships. I mean, you can do it app by app, but this is where not only CIAM but also tools like UMA can help you externalize logic that’s hard to do. We go around telling people that you shouldn’t build security code, you should build secure code, and don’t invent your own crypto library, for example. Identity and access management, and the things that these protocols are here to do, are there to make your life easier. They tend to be well vetted by a lot of experts worldwide and are there to accelerate your time to success.

Steve: And by easier you mean helping CISOs get a little bit more sleep at night, but also easier for your customers as well: peace of mind knowing, hey, this company is really doing some things right, and I feel that they make my privacy a concern of theirs, and rightly so.

Eve: Right. Well, absolutely. If you want to scale security higher and faster, a way to do that is to use security standards.

Steve: Right. That’s what they are there for. In wrapping up, is it possible to see a live demo of the UMA protocol with HEART profiles in action, perhaps with API standards like the FHIR one you mentioned, working today? And where would someone go to get in touch with you or someone from your team to learn more about it?

Eve: Yes, absolutely. First of all, ForgeRock has a demo of UMA along with identity relationship management concepts: all the relationships among people, and among people and devices, and cool, important tricks that you can use for dynamic authorization of access, using a consumer-controlled health and IoT data-sharing use case that we’ve nicknamed HealthServ. And of course, we’d love to show it to people. I’m XMLgrrl (two r’s, no i) on Twitter, so people know how to get a hold of me; hopefully that’s an easy way to do it. And there are a whole bunch of health-enabled and health-focused implementations by now. I’d invite people to get in touch with me, and I’m happy to share that knowledge.

Steve: I’ll put some links to XMLgrrl and add that as a resource on the podcast web page.

Eve, this has been awesome, getting to hear in more detail about the work that you’re doing. Thanks so much again for your time today, and I hope we can do this again.

Eve:  It’s been such a pleasure.  Thanks so much, Steve, for the opportunity.
