First you have to prepare the appliance. Let's create a new application called Windows Remote Desktop.

(Tip: use the word vpn or windows in the application name, and the appliance will load a matching picture automatically.)

I have checked the “Biometric check support” and “Force biometric check” boxes. This ensures a fingerprint or face check during the login process.

Once the application is created, you can define users manually, or you can add an external user source.

You can choose from those three:

Here is a basic AD configuration. More…

If the user has sufficient rights, the new user source will be added. Onboarded users will be synchronized with AD. Later, with users onboarded, you can configure an escalation mechanism or multi-user approval. More…

To make the new application visible in the Notakey Authenticator, configure the onboarding requirements. If phone numbers are configured in AD, you can use onboarding via SMS or WhatsApp messages.

Use any available method or combine them. In this case the SMS method is enabled. In some countries the SMS service might be unreliable, so choose another method and let us know. If you use the simple credentials method and want users to authenticate against AD, select the edit link after enabling this option and check the “Authenticate against remote AUTH user sources” box.

Onboard the Notakey Authenticator to this newly created service.

Now create API access credentials to allow WCP to connect to the Notakey appliance. Add a new client.

Leave the scopes field empty; more about scopes can be found in the API manual. The client ID and secret will be generated automatically.

The appliance part is now ready.

Now let's move to the Windows machine. First, create a registry file with the access credentials for the appliance. (You can also edit the Windows registry directly.)

Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Notakey]
[HKEY_LOCAL_MACHINE\SOFTWARE\Notakey\WindowsCP]
"ServiceURL"="https://demo.notakey.com/api/"
"ServiceID"="29afa6c0-907d-40f4-898f-8a96fcdf7230"
"ClientID"="c6df76db-10a7-4349-8e05-8b3656a4d404"
"ClientSecret"="pZ5lSo1Rz_ftCDg4h302794ZlmjMO64sVtiAvHop6dI"
"MessageTtlSeconds"=dword:0000001e
"MessageActionTitle"="Winlogin"
"MessageDescription"="Proceed as {0} on server {1}?"
"AuthCreateTimeoutSecs"=dword:00000014
"AuthWaitTimeoutSecs"=dword:0000003c

Copy this content into the file wcp_notakey.reg. More…
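Note that the dword values in the file above are hexadecimal. A quick shell sketch to decode them into seconds:

```shell
# The dword values in wcp_notakey.reg are hexadecimal; decode them to seconds
echo "MessageTtlSeconds:     $((0x1e)) s"
echo "AuthCreateTimeoutSecs: $((0x14)) s"
echo "AuthWaitTimeoutSecs:   $((0x3c)) s"
```

So the push message lives for 30 seconds, creating the authentication request may take up to 20 seconds, and the provider waits up to 60 seconds for the user's response.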

Double-click the registry file to import the settings. Open the Registry Editor and check that everything was loaded.
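Alternatively, you can confirm the import from an elevated command prompt using Windows' built-in reg tool (the key path matches the file above):

```shell
reg query "HKLM\SOFTWARE\Notakey\WindowsCP"
```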

Download the newest Windows credential provider from Notakey and install it. Restart Windows and log in with the old method. Then open the Registry Editor, navigate to HKEY_LOCAL_MACHINE > SOFTWARE > Microsoft > Windows > CurrentVersion > Authentication > LogonUI, and copy the value of LastLoggedOnProvider.

Lock the machine, choose the Notakey Credential Provider, and enter your username and password. Notakey will ask for your permission to encrypt your password.

Now we have to disable the default credential provider. Open the Registry Editor again and navigate to HKEY_LOCAL_MACHINE > SOFTWARE > Microsoft > Windows > CurrentVersion > Authentication > Credential Providers. Find the provider's key (similar to what you copied from LastLoggedOnProvider previously), right-click the CLSID of the credential provider that should be disabled, and add a new DWORD (32-bit) value named Disabled with the value 1.
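If you prefer, the same change can be scripted as a registry file, in the same style as wcp_notakey.reg above. The CLSID below is an assumption – it is the stock password provider on current Windows versions – so verify it against the value you copied from LastLoggedOnProvider before importing:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\Credential Providers\{60b78e88-ead8-445c-9cfd-0b87f74ea6cd}]
"Disabled"=dword:00000001
```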

You can do this through Group Policy as well. More…

Now only the Notakey credential provider is left. The next time you log into your system, only your username and the second factor will be requested.

That's all.

Thank you for choosing Notakey.

Configure the VPN concentrator to authorize users against the RADIUS server. Test it, and if everything works as it should, add Notakey in between to provide the second factor.

Create a VPN application in the Notakey authentication server if you haven't already. More…

Copy the access ID; you will need it later.

On-premise version setup

You have to configure the built-in auth-proxy service. A reference to all possible configuration settings can be found here: Authentication Proxy

Point your concentrator to the NAA instead of the RADIUS server.

Prepare the configuration JSON: change “vpn_access_id” (the application access ID in the Notakey appliance dashboard, see the picture above) and “vpn_radius_address” (your real RADIUS server IP).

{
"vpn_port_in": "1812",
"vpn_port_out": "1812",
"vpn_radius_address": "10.0.1.23",
"vpn_secret_in": "secret_from_concentrator",
"vpn_secret_out": "secret_to_radius_server",
"vpn_access_id": "c468f008-485b-b11c-aad9672fbae1",
"vpn_message_ttl": "30",
"message_title": "Proceed with login to VPN as {0}? 😎",
"message_description": "Allow {0} login?"
}
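A misplaced quote makes the whole configuration unparsable, so it is worth validating the JSON before feeding it to the appliance. A quick sketch using Python's stdlib json.tool (any JSON validator works; the file name and the message texts here are placeholders):

```shell
# Write the auth-proxy configuration to a file and validate it before applying
cat > ap_config.json <<'EOF'
{
  "vpn_port_in": "1812",
  "vpn_port_out": "1812",
  "vpn_radius_address": "10.0.1.23",
  "vpn_secret_in": "secret_from_concentrator",
  "vpn_secret_out": "secret_to_radius_server",
  "vpn_access_id": "c468f008-485b-b11c-aad9672fbae1",
  "vpn_message_ttl": "30",
  "message_title": "Proceed with login to VPN as {0}?",
  "message_description": "Allow {0} login?"
}
EOF
# json.tool exits non-zero on invalid JSON
python3 -m json.tool ap_config.json > /dev/null && echo "JSON OK"
```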

Feed configuration keys with command:

ntk cfg set :ap '{
"vpn_port_in": "1812",
"vpn_port_out": "1812",
"vpn_radius_address": "10.0.1.23",
"vpn_secret_in": "secret_from_concentrator",
"vpn_secret_out": "secret_to_radius_server",
"vpn_access_id": "c468f008-485b-b11c-aad9672fbae1",
"vpn_message_ttl": "30",
"message_title": "Proceed with login to VPN as {0}? 😎",
"message_description": "Allow {0} login?"
}' --json-input

and restart auth-proxy:

ntk ap restart

We use “:” before the ap key so that this setting is stored globally in the cluster. If you have many VMs, you only need to restart the ap service on each of them.

The user name should be the same in the Notakey appliance and in the RADIUS server.

Cloud version setup

You have to run the dockerized auth-proxy service in your local network as a proxy to the RADIUS server. Communication between the auth-proxy and the Notakey appliance is encrypted and safe to use outside the LAN. Example:

docker run --name auth-proxy \
-e NOTAKEY_HOST="https://2fa.your_company.com/api" \
-e NOTAKEY_ACCESS_ID="c468f008-485b-b11c-aad9672fbae1" \
-p 1812:1812/udp \
-e LISTEN_ADDRESS=0.0.0.0 \
-e ADDRESS_SECRET_IN=0.0.0.0:secret_in \
-e DOWNSTREAM_ADDRESS=10.0.1.23 \
-e SECRET_OUT=secret_out \
notakey/authproxy:latest

You can configure more than one concentrator with different secrets by adding more -e ADDRESS_SECRET_IN=0.0.0.0:secret_in variables.

MikroTik RouterOS has its own scripting language, powerful enough to be used for various tasks. In the repo notakey/mikrotik-2fa-vpn you can find all the components necessary to make this happen.

1. Install the JParseFunctions library.

[admin@router] > /system script add name=JParseFunctions

[admin@router] > /system script edit JParseFunctions

and copy the content of JParseFunctions.lua into this script. In WinBox, go to System/Scripts, add a new script, name it, and copy the same content into the Source field.

Repeat the same steps with NotakeyFunctions.lua.

These are the two base libraries that allow the Notakey authentication script to run.

2. Create a new PPP profile with 2FA enabled

[admin@router] > /ppp profile add name=Notakey2FA local-address=VPN_Pool remote-address=VPN_Pool address-list=vpn_pending dns-server=1.1.1.1

Configure this profile according to your needs. Just remember to add the address-list field and name it vpn_pending. The IP of each newly established VPN connection will be dynamically added to the vpn_pending list.

Add a firewall rule that allows connections to the Notakey appliance. If you connect with your smartphone, without this rule you will not be able to approve your connection, as all traffic will be blocked. Add the same rule for your DNS server.

[admin@router] > /ip firewall filter add chain=forward action=accept dst-address=ntk_appliance_ip

You can create a list vpn_allowed and use it for all addresses you want to be accessible without approval from your smartphone. In that case, use dst-address-list instead of dst-address.
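The vpn_allowed variant can be sketched as follows (ntk_appliance_ip and dns_server_ip are placeholders for your real addresses):

```
[admin@router] > /ip firewall address-list add list=vpn_allowed address=ntk_appliance_ip
[admin@router] > /ip firewall address-list add list=vpn_allowed address=dns_server_ip
[admin@router] > /ip firewall filter add chain=forward action=accept dst-address-list=vpn_allowed
```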

Add a firewall rule to block any requests from addresses in the vpn_pending list, like this:

[admin@router] > /ip firewall filter add chain=forward action=drop src-address-list=vpn_pending

Create a VPN application in the Notakey authentication server if you haven't already: https://documentation.notakey.com/nas/#applications

We have to add “up” and “down” scripts to this profile to initiate a second-factor request to the Notakey appliance.

[admin@router] > /ppp profile edit Notakey2FA on-up

and copy the # OnUp script part from PppProfileScript.lua into this field.

Do the same for the # OnDown script part from the same file.

[admin@router] > /ppp profile edit Notakey2FA on-down

You have to change two values in this script:

  1. Notakey appliance address $ntkHost
  2. Application access id $ntkAccessId

[admin@router] > /ppp profile edit Notakey2FA on-up

You can also change the message that will be sent to the smartphone.

Create a new user and assign the newly created profile to it:

[admin@router] > /ppp secret add name=ntk_user service=l2tp profile=Notakey2FA password=secret_password

If you are using a RADIUS server as an external user source, remember to set this profile as the default for the corresponding server.

The user name should be the same in the Notakey appliance and in the MikroTik router (or RADIUS).

That’s all.

Thank you for choosing Notakey.

Does your organisation need multi-factor authentication? Yes. But what next? How do you choose the right solution for your specific situation? There is no one-size-fits-all answer!

This guide will outline the different aspects of multi-factor security that need to be taken into consideration, and help you evaluate them in the context of your requirements.

So what should be considered?

Typically, organizations need to consider their threat model (and the security impact), privacy requirements, as well as availability, feature-set and cost constraints.

Additionally, you may want to consider the direction in which the overall security landscape is moving. Is your solution future-proof, or will you have to change it in a year?

For simplicity's sake, we have highlighted 3 factors which – according to Gartner – are the leading product selection criteria among the leading IT companies.

Afterwards, we have prepared a technical appendix which explores the remaining requirements in more detail.

Trust

You need to trust your security products, but where does trust come from? The technical strength of a discrete authentication transaction is important, but trust is about much more than that.

Trust is also about control. What data does the solution require, and where will it be stored? On your premises, or in the cloud? Is the code available for audit, or is it a black box? Nowadays data is currency, so evaluate carefully what data your solutions require from you, where they will put it, how they will store it, and what they wish to do with it. This is the solution's data model.

Also consider vendor reliability and support. Will your vendor provide technical support? Consider the terms, SLA, and also use cases. Will your vendor support your use cases directly? Or will you have to assemble a solution from components provided by many different vendors?

To trust a solution, you need to trust its security model, its data model and its support model.

Trust the Security Model

The security model of a solution is not a decisive factor on its own, but it is certainly important. The technological underpinnings of a solution can severely limit its real-world applications, impact and scalability. Crucially, the technology should never rely on human behaviour – e.g. keeping secrets safe manually, or remembering and never divulging them. Expect human operators to be the weakest link in the chain, and never expect anybody to act against self-interest.

The technological foundations

A fundamental choice each solution has to make is what technology to base itself on. Typically the options are:

  • Home-grown cryptography. It may try to account for future constraints (e.g. quantum computers) at the expense of current ones, or in the worst case – it may be reinventing the wheel. Consider if you want your production systems to be used as a research experiment.
  • Various shared-secret schemes. The technology is inherently limited, and it relies on human operators to keep secrets safe.
  • Public-key cryptography (PKI). While not perfect, only solutions based on PKI are provably immune to brute-force attacks. They are also the least-reliant on human operators. PKI has a long track record of being used in the real world – including the banking sector, e-commerce and military.

Consider who is in Control (pt. 1)

Being in control means knowing where your solution is hosted, what other services it communicates with, and finally – knowing what goes on under the hood. Above all, it means having certainty about these things, and being able to verify everything yourself.

On-premise or SaaS

Consider your infrastructure. Does your organisation have on-premise tools, hardware or applications, which need to be tightly integrated with MFA? Are your users stored in Active Directory? Do you want to add MFA to your VPN concentrator, which is accessing your RADIUS servers?

  • Verify if your internal resources can be integrated with the chosen MFA solution.
  • Check if your corporate security policies are compatible with any external requirements.
  • Consider which country hosts your data centers. Is your data legally allowed to be stored there? (See the next point.)

Auditable communications

What other services is your MFA solution communicating with? Do you know for a fact where it sends data, and why?

  • Consider if your solution has clearly documented network communications.
  • Will you be able to notice deviations from the official network communications model?

Consider who is in Control (pt. 2)

What goes on under the hood?

Trust is earned. If necessary, can you find out for certain what is going on under the hood? What is your MFA solution doing with its cryptographic keys? How is it verifying transactions? Can you be absolutely certain it has no backdoors?

  • Prioritize solutions with open source components.
  • Prioritize audited solutions.

Transaction verification

Can you independently verify transactions? Or does your MFA solution simply respond with “OK” or “NOT OK”?

  • Can you access the technical response, and verify it independently?

Don’t Forget About Regulation

In the European Union, companies need to consider the requirements of the Revised Payment Services Directive (PSD2) and the General Data Protection Regulation (GDPR). PSD2 stipulates strong authentication for FinTech companies, whereas GDPR limits what personal data can be stored, and where. Do you trust your MFA solution to be compliant?

Where will the data be stored?

Does your solution have a SaaS business model? If so, consider what data it expects from you, and in which countries it will host this information. Does it comply with GDPR requirements?

  • Consider where a SaaS solution is hosted.

Do you need to hand off personal data?

Does your solution store user information, including personally identifiable information? Can it work without this information? Can you easily delete this information, if GDPR mandates it?

  • The less data an MFA solution needs, the better.
  • Can you easily bulk-delete user data?

Fintech customers

Are you a FinTech company in the EU? You may need to comply with the strong authentication requirements of PSD2.

  • Will your MFA solution be used by your customers?
  • If so – does it qualify as a strong authentication provider under PSD2?

Resiliency and Availability

An essential part of trust is having the product work when you need it. Can you count on your chosen MFA solution to be highly available?

  • Does the solution have high technical availability?
  • Does the solution remain usable under heavy load?
  • SaaS – what is the promised SLA? What is the historical SLA?
  • On-premise – are there tools and documentation for maintaining a highly available setup?
  • On-premise – is the highly-available configuration supported natively? Does it depend on third-party products? (see section about hidden costs)

Total Cost of Ownership

This is what every business boils down to. Everybody needs to keep costs low, and the trick to success is accurately measuring the true costs of various day-to-day issues.

When it comes to software products, there is only one visible cost – the advertised price (the license fee for on-premise products, or the per-user fee for SaaS products). However, it does not come close to telling the whole story.

Capital expenditure can include user and administrator training costs, as well as hidden up-front costs for hardware, or dependent software licenses.

You also need to consider the operational expenditure – day to day administrative costs, implied cost differences from changes in employee productivity, support costs, feature upgrade and technical maintenance costs.

Evaluating a product’s TCO is about paying attention to the fine print, anticipating your future requirements and accurately estimating changes in employee productivity.

Beware Product Dependencies

It can be tempting for product vendors to hide fees in the fine print while advertising a lower price. You can avoid falling for this if you know what to look out for. Hidden fees are typically buried in hardware costs, support costs, and software dependencies. Consider whether the product can be used on its own, as-is, or whether it requires certified auxiliary products for certain features.

Dependencies on other products

The ideal product can cover your requirements on its own. Double check the fine print for dependencies on other software – either first-party or third-party.

  • Does the product need a specific, licensed operating system to run?
  • Consider external requirements – databases, API servers, service providers. Does the evaluated product need any of this? Is any of this required specifically to help the product provide its features?
  • Does the product provide its own high availability features? Or do you need additional software for service discovery, clustering functionality, health monitors etc.?
  • What about additional hardware for load balancing, and assigning floating IP addresses?

Dependency Maintenance

Maintenance fees

  • Would you have use for the additional products on their own? If not, consider them a part of the product being evaluated.
  • Will the dependencies cost you time & money to maintain?
  • Consider the time it takes to train people to use and maintain additional components in your IT infrastructure.

Human Resource Costs

Every product has indirect costs due to training (both for day-to-day usage and for maintenance) and support. A product that causes many support requests is indirectly more expensive. Conversely, if a product causes employees to spend less time requesting support, and improves their overall productivity, it offers better value for money.

Day-to-day use and maintenance

Consider support response times. How long will employees have to wait in case of an outage?

  • Is it possible and realistic to solve problems in-house, in case support is taking too long? Does the vendor provide tools/documentation for this?
  • SaaS – consider availability history (is this data available? If not – why?)
  • On-premise – how often will the solution need manual attention? Can these tasks be automated?

Time to Deployment can be Costly

Not every solution can be set up easily. How quickly can you start testing the solution in a real-life scenario? How quickly can you scale the test to 10 people? To the whole organization?

Technical Set-up time

Consider how easily the solution can be “turned on”, including all the minutiae. This includes signing up for an account, configuring client security, configuring firewalls, finding software that adds the required security, or finding software-development kits and writing the required functionality.

Consider if there are pre-made software components for your specific use-case (e.g. VPN, Windows access etc.).

  • Consider whether you can start using the product in one day. If not – are there compelling reasons?

Short excerpt from real life

Be Careful with SMS

SMS is a historically popular one-time-password delivery channel. However, be aware of its two major shortcomings:

  • It is unnecessarily costly to send SMS OTP for every transaction. Alternative delivery methods can be more efficient and cost less.
  • It is a very insecure delivery channel. SMS messages can be easily intercepted, and easily redirected to other phones.

Don't Believe It?

The US National Institute of Standards and Technology (NIST) has recommended against using SMS for multi-factor authentication since July 2016.

Prior to that – in 2012 – the Australian Communications Alliance declared SMS insecure and inappropriate for high-value transactions.

https://www.itnews.com.au/news/comms-alliance-says-sms-unsafe-for-bank-transactions-322586

There are numerous horror stories about SMS-based security gone horribly wrong. Just google it yourself.

Be careful with SMS. Consider if you have an alternative, and complement SMS with additional layers of authentication.

Opportunity Costs

Product feature sets open new opportunities; other opportunities, however, are closed off. For example – by implementing a company-wide security policy based on smartcards, you get all the associated benefits, but may prevent your company from utilizing a lower-cost or more efficient product where applicable. Always consider what your chosen solution implies for the rest of your tech stack – now and in the future.

Consider the future

  • Does the solution align with the latest trends? Will it be relevant in the near future? Will other technological products in the near future be compatible with it?
  • Consider if the chosen product precludes the use of some other technology.
  • Consider how easy it is to migrate away from a chosen product, if the need or an opportunity arises.
  • Consider your business – are you heading in a specific technological direction? Will it be compatible with your chosen MFA solution?

User Experience

According to Gartner, user experience is in many cases the deciding factor. It is not enough to be trustworthy and cost-effective; software products still need to be enjoyable to use. This makes sense – software products can easily be “secure enough”, and their price can be “good enough”. User satisfaction, however, can always be improved.

User experience is important for every software product, but doubly so for security products. Heightened security typically makes a product more cumbersome to use than products from other categories. Products that provide authentication are typically used often enough for a bad user experience to translate into real annoyance.

If users start disliking a product, they may try to avoid it or circumvent it, undermining technology, which “on paper” is highly secure. Alternatively, users may disengage or put off doing certain tasks. This can undermine the product’s cost-effectiveness through lower customer engagement. For example, consider how likely users are to use online banking daily when it is protected with a hardware token (cumbersome experience) versus a smartphone’s fingerprint reader (better experience).

Clearly, user experience matters.

Hardware Experience

A good product should get out of the way, not require too much focus, and generally perform as many tasks as it can without asking for user input. Otherwise, users may avoid or circumvent the product, thus compromising security. The accessory type plays a huge role in this.

  • Consider the hardware involved. Is the MFA product based on the smartphone-as-a-token model, or on USB keys? Or traditional hardware PIN pads? Does the MFA solution support different types of hardware natively?
  • Consider your threat model – what is the easiest user experience you can afford?
  • If your users need to approve transactions often, a hardware PIN pad may put them off.
  • If your users need to perform infrequent, but sensitive operations, maybe a hardware PIN pad is appropriate.
  • When considering frequency of use, consider what happens to lost devices. Can they be revoked? Will the user notice, if the device is missing?

Software Experience

Software also has a huge impact on the user experience. A smartphone accessory, just like other types of hardware, can offer wildly different user experiences. Compare SMS-based one-time passwords to QR-code scanning, to push-based approval. Even push-based transactions can differ – some can be interacted with directly, and some require the application to be loaded first.

  • Does the solution support an individual PIN/password? Is it required? Is it optional? Can it be mandatory, and delegated on-demand (i.e. to the smartphone’s OS-level PIN)?
  • Does the solution require any type of manual code input?
  • Does the solution support offline use? If so – can it offer visual (e.g. QR code based) transaction signing?
  • Does the solution offer backups? Can you continue using the product if you lose access to a hardware accessory?
  • Is the solution fast?

Administrator Experience

Administrators are a different type of user, but they are users nevertheless, and their experience matters. If the administrator experience is bad, the solution might be rolled out incorrectly or only partially, and not maintained correctly. These factors can significantly compromise security.

  • Is the solution well documented?
  • Are common maintenance tasks automated?
  • Does the solution require specific applications for administration, e.g. Internet Explorer?
  • Consider whether the solution pays attention to details – smaller things like autocomplete, a responsive user interface, etc.

Privacy is Part of the Experience 

Are there privacy-oriented features?

Consider whether the MFA solution supports advanced features which could improve the privacy of its users. For example, does it support geofencing? If so, you could whitelist approved locations for access and eliminate the scenario where your employees access your applications from an insecure public WiFi (e.g. at the airport).

Alternatively, consider where the user information is stored. Is it on your servers? That makes you liable for keeping this data secure – you already have a lot on your plate, and you don't need to add more to it. Is the information stored with the user? Then consider whether you are satisfied with how securely it is stored, and whether it matches your security requirements.

  • Consider if the product has advanced features that contribute to user privacy.
  • Can the product’s design and feature-set reduce your company’s liability?


What Next?

We hope that the points above will help kickstart an internal discussion and – hopefully – lead to a rational and beneficial choice.

Security is important, so you need to consider your options carefully. However, don't forget that any security is better than no security. If you cannot decide on a solution to commit to long-term, try some free or cheap options to get a feel for it. In the short term, this will help you understand the capabilities and constraints better, as well as provide security while you choose.

For more information, consult industry-leading analysts, such as Gartner. Check out all the different vendors, their feature sets and price lists, and consider additional points which may be relevant to your situation: do you need an offline mode? What is your threat model? How reliable and highly available does the solution need to be?

Consider your needs, and weigh the pros/cons against them. It is easy to wish for a product that scores 100% in every category, but in reality – every product is a different combination of various tradeoffs.

Keep that in mind, and good luck!