The Programmable Company: Identity and Access Management, a Pentalog Standard

Cornel Fatulescu
Chief Platform Officer

Identity and access management is one of the most complex capabilities that many of us take for granted. As a result, organizations not only neglect some of the associated risks, but also overlook many of the digital opportunities of our highly connected world.


This article explains why the Pentalog Technology Office has drawn a hard line in the sand for architectural standards: from now on, our default authentication and authorization solutions will be built on established identity and access management products.

Any deviation from this standard must first be reviewed by our Technology Office, then approved by the Security Team, and finally the customer's consent must be formally traced in our collaboration history.

Here’s what this article provides:

  1. To build the argument for the standard, we start with a short tour of the recent, market-driven history of authentication best practices, up to the popularity of MFA.
  2. Next, we trace the evolution of identity protocols – the most important part of the argument.
  3. Then, we take a brief tour of the serverless identity providers that Pentalog favors as default options.
  4. Finally, we summarize the changes that apply starting in the second quarter of 2021.

 

Authentication Best Practices up to MFA

To keep this timeline short, we will focus on the practical considerations of authentication in computer science rather than the underlying cryptography, whose roots go much deeper into history.

Identity and Access Management History

The timeline below traces the recent, market-driven selection of authentication best practices, up to the popularity of MFA. Try to identify where your authentication mindset sits in the timeline – it’s a great starting point before the discussion of protocols.

  1. The History of Authentication

    During the Sixties, secrets protection became an important discipline in computer science. It was password-centric, and if a system was hacked, it was easy for attackers to extract the passwords from the underlying file or database.

    In the Seventies, the problem was addressed with a cryptographic concept called the hash function, which enabled password verification without storing the password itself. While this mechanism was good enough for its time, it remained vulnerable to basic brute-force attacks.
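
    As an illustration of the concept, here is a minimal Python sketch of salted password hashing and verification. It uses the standard library's PBKDF2 construction, which is a modern, iterated descendant of the original idea rather than a Seventies-era algorithm; the function names and parameters are ours, purely for illustration.

    ```python
    import hashlib, hmac, os

    def hash_password(password: str, iterations: int = 200_000) -> tuple[bytes, bytes]:
        """Return (salt, derived key); only these are stored, never the password itself."""
        salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
        return salt, key

    def verify_password(password: str, salt: bytes, stored_key: bytes,
                        iterations: int = 200_000) -> bool:
        """Re-derive the key from the candidate password and compare in constant time."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
        return hmac.compare_digest(candidate, stored_key)
    ```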

    Later, scientists concerned with securing data in transit gave birth to public key cryptography. Unfortunately, those solutions were mostly used by government infrastructures. Nevertheless, this was the catalyst for digital certificates, digital signatures, and asymmetric key algorithms such as RSA.

    During the Eighties, attackers could still guess, steal, or intercept passwords with ease. Therefore, a new concept emerged, the one-time password, which later led to standards such as HOTP and TOTP. The basic idea was to generate non-predictable passwords in a way that still allowed them to be validated.

    These passwords were delivered to users through specific hardware or special communication channels. Today, we call this “Two-Factor Authentication” (2FA) and “Multi-Factor Authentication” (MFA).
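
    To make the one-time password idea concrete, below is a small Python sketch of a time-based OTP in the spirit of RFC 6238 (TOTP), using only the standard library. The 30-second step, 6-digit length, SHA-1 digest and base32-encoded shared secret are conventional defaults; this is an illustration, not a production implementation.

    ```python
    import base64, hashlib, hmac, struct, time

    def totp(shared_secret_b32: str, step: int = 30, digits: int = 6) -> str:
        """Derive a short-lived, verifiable code from a shared secret and the current time."""
        key = base64.b32decode(shared_secret_b32, casefold=True)
        counter = int(time.time()) // step                     # changes every `step` seconds
        msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                             # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Example: both the user's device and the server compute totp("JBSWY3DPEHPK3PXP")
    # and the codes match only while the same 30-second window is open.
    ```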

    Public key cryptography was a very powerful concept, but it raised questions such as how much confidence to place in the issuer of the keys. During the Nineties, trusted third parties that issue such keys or digital certificates became commercially available.

    By 2000, 2FA was popular in enterprises and ‘more than one factor’ was adopted as standard authentication.

    It took another 10 years until the smartphone industry brought 2FA to consumers. That was the moment when 2FA became cheap as it didn’t require new hardware any more. A simple text message was good enough.

    Behind the scenes, tremendous innovation driven by big data and machine learning enabled AI-powered security information and event management tooling.

    Today, MFA is the standard in authentication for consumers, and AI-driven “Security Information and Event Management” (SIEM) has become increasingly popular.

    Keep in Mind

    • The overall authentication landscape is much more complex than the description above. What is important to know is that developers rarely have the full background, as it is almost impossible to stay up to date with every emerging concept their software development jobs require.
    • In addition, this timeline does not include other large and highly sophisticated fields of infrastructure and data, and it barely touches the surface of the body of security knowledge.
    • We emphasize this because asking developers to choose solutions in these areas without the help of security and infrastructure experts is a high risk for the organization.

  2. Protocols

    The Nineties were an exceptional time when local networks proliferated. It was a period when every application implemented its own authentication, and it coincided with the consolidation of IT departments.

    It was also in this period that some of the most modern organizations managed to give users the impression that, once they had signed in to the system, every application knew their identity.

    This was possible thanks to something called Domain Integrated Authentication, which relied heavily on the underlying infrastructure: if the PC is part of the network, the network knows who you are.

    This architecture introduced a very important concept called Domain Controller: a special machine with the role of knowing everything about the local network.

    This architecture came with many challenges: the operating system might not support the capability, and in some cases it was not possible to bring your own device or to enroll partners’ temporary devices.

    Moreover, sharing resources across networks was not supported by Domain Controllers, and with the increasing number of large external applications that businesses needed to integrate, it became impossible to rely on a single infrastructure for authentication. This is how the idea of the identity provider emerged, as the internet-scale equivalent of the domain controller.

    Crossing Boundaries – the Identity Provider

    Claims-based identity was born to address the limitations of the domain controller confined to its network boundary – as you can see in the image below.

    Identity Providers

    This is how identity information spans across boundaries. Take note that this is a high-level overview. We skipped many details on purpose.

    • An application reads the identity provider metadata.
    • The identity provider is typically in an external network and the application is configured to trust the identity provider.
    • The user of a classic web browser application is automatically redirected to the identity provider for sign-in.
    • If the transaction succeeds, the identity provider releases a special string called a token in the format dictated by the protocol of choice.
    • The token is sent to the application, which verifies its validity.
    • If everything is fine, the application can extract claims describing the user, and use this information for further authorization to application features.
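
    To make the last two steps concrete, here is a hedged Python sketch of token validation using the PyJWT library, assuming an OpenID Connect style identity provider that signs tokens with RS256 and publishes its keys at a JWKS URL. The issuer, audience, and JWKS path below are placeholders, not a specific product's values.

    ```python
    import jwt                      # PyJWT
    from jwt import PyJWKClient

    ISSUER = "https://idp.example.com"            # placeholder identity provider
    AUDIENCE = "my-application"                   # placeholder application / client id
    jwks_client = PyJWKClient(f"{ISSUER}/.well-known/jwks.json")  # placeholder JWKS URL

    def verify_token(token: str) -> dict:
        """Check the token's signature, issuer and audience, then return its claims."""
        signing_key = jwks_client.get_signing_key_from_jwt(token)
        claims = jwt.decode(
            token,
            signing_key.key,
            algorithms=["RS256"],
            issuer=ISSUER,
            audience=AUDIENCE,
        )
        return claims   # e.g. claims.get("email") or claims.get("roles") for authorization
    ```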

    Single Sign-On (SSO)

    To this point, we’ve looked at the cross-network boundary problem. But organizations were facing other challenges, like Single Sign-On (SSO).

    Cross-domain SSO

    Consider an application that links to three other applications for detailed operations. After the user signs in, you wouldn’t want to prompt the user again for authentication when accessing the three other applications.

    This is how the SAML and Web Services Federation (WS-Federation) protocols emerged in the early 2000s. While SAML evolved into its second major version, SAML 2.0, WS-Federation proved too complex for wide adoption and too limited for modern web applications, which led to its gradual disappearance.

    The Access Delegation Problem

    For the benefit of the user, social networks and various other modern web platforms request user credentials to connect to other systems on that user’s behalf.

    Access Delegation Problem

    For example, when you move from one social network to another, you might want to import contacts from the previous one or from your email address book. Sharing your email password with third parties is very risky: this kind of credential can grant access to sensitive accounts such as online banking or social security.


    Major technology companies such as Google and Twitter backed the first version of the OAuth protocol, released in 2010. Today, the viable protocol is OAuth 2.0, and here is how it works:

    • A user has access to two applications that we will call Application1 and Application2.
    • Application1 has features that allow it to access resources managed by Application2.
    • Application1 redirects the user to an authorization server, at what is called the Authorization Endpoint.
    • This authorization server responds to the request, asks the user to authenticate, and if the operation succeeds, asks for the user’s consent to delegate access to specific resources to Application1 (the OAuth consent dialog).
    • This returns to the browser an authorization code, which is then shared with Application1.
    • Application1 connects to a different endpoint to prove its authenticity and share the received code.
    • If the authentication of the application succeeds, the authorization endpoint returns a token to Application1 so that it may connect to Application2.
    • While the token is valid, transactions between Application1 and Application2 may take place to access only the resources for which the user gave consent.
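
    Below is a minimal Python sketch (using the requests library) of the code-for-token exchange described in the last steps above. The token endpoint, client identifier, client secret, and redirect URI are placeholders that would come from the authorization server's registration; a real implementation would also add error handling and PKCE where appropriate.

    ```python
    import requests

    TOKEN_ENDPOINT = "https://auth.example.com/oauth2/token"   # placeholder authorization server
    CLIENT_ID = "application1"                                  # placeholder client id
    CLIENT_SECRET = "..."                                       # kept server-side, never in the browser
    REDIRECT_URI = "https://app1.example.com/callback"          # placeholder redirect URI

    def exchange_code_for_token(authorization_code: str) -> dict:
        """Application1 proves its identity and trades the authorization code for a token."""
        response = requests.post(
            TOKEN_ENDPOINT,
            data={
                "grant_type": "authorization_code",
                "code": authorization_code,
                "redirect_uri": REDIRECT_URI,
            },
            auth=(CLIENT_ID, CLIENT_SECRET),   # client authentication at the token endpoint
            timeout=10,
        )
        response.raise_for_status()
        return response.json()                 # access_token, expires_in, possibly refresh_token
    ```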

  3. Serverless Products

    Today, it’s hard to imagine a business that doesn’t need to leverage the security features we have looked at.

    If a new application is a success, another one will soon appear next to it. Both will have to share the same user pool, and that user pool will also be used by the growing number of tools the business needs to function properly – CRM, ERP, office tools, or any productivity and collaboration suite. Then MFA will be required, and so forth.

    We recommend that our customers do not reinvent the wheel, but instead, consider an industry-proven identity and access management solution.

    In the last two years, most of the user services implemented by Pentalog were based on the Microsoft Active Directory family of products or Amazon Cognito (AWS). These are the default options for public cloud solutions.
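
    As an indication of how thin the application-side code can become with a managed product, here is a hedged Python sketch that authenticates a user against an Amazon Cognito user pool with boto3. The region, app client id, and the use of the USER_PASSWORD_AUTH flow are assumptions for the example, not a recommendation for every project.

    ```python
    import boto3

    # Placeholders: the region and app client id come from the customer's Cognito user pool.
    cognito = boto3.client("cognito-idp", region_name="eu-west-1")
    CLIENT_ID = "example-app-client-id"

    def sign_in(username: str, password: str) -> dict:
        """Authenticate against the managed user pool and return the issued tokens."""
        response = cognito.initiate_auth(
            ClientId=CLIENT_ID,
            AuthFlow="USER_PASSWORD_AUTH",       # this flow must be enabled on the app client
            AuthParameters={"USERNAME": username, "PASSWORD": password},
        )
        return response["AuthenticationResult"]  # AccessToken, IdToken, RefreshToken, ...
    ```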

    Of course, customers are free to use whatever product makes the most sense for them in their context (see Okta, Ping Identity…).

    The customer makes the final choice based on our advice, because these services will become part of the customer’s infrastructure, and it is the customer who pays the bills for these resources, not Pentalog.

    Customers should also double-check for compliance and security with their staff. The overall backup strategy, environment management, specific permissions, or downtime policies must be reviewed during the selection process.

  4. Conclusion

    With the products mentioned above, a new application that requires basic user sign-in and access control takes on average five to ten man-days of DevOps work. Refactoring a few months after go-live can require more effort than all of the product development up to that point.

    Some sophisticated authorization flows, even for new applications, can require two to three months of development, configuration, and testing. This is the exception, but if the solution was not designed properly from the start, refactoring can be measured in quarters.

    This standard emerged from customer collaboration with significant involvement from the Technology Office, Customer DevOps, Infrastructure, and Security departments.

    This standard is required to ensure the foundation of default solution architectures that we propose to our customers.

    It is a new era, where customers want to move faster and faster. While asking the same questions again and again has value, making the same mistakes does not.

    Deviating from this standard should only be possible through escalation to the Technology Office, with subsequent validation from the security team and formal validation from the customer traced in our collaboration history.

