When you travel from Oslo to Seattle you also go nine hours back in time. I had been warned about what they call “jet lag”. I had never been outside Europe, and, amateur that I am, I asked the receptionist to book a ticket arriving one day before the event. That was a big mistake! I should have booked a ticket arriving at least two or three days before the event. It took me three days to get back into normal shape. The jet lag was horrible, but I survived by drinking a lot of strong black coffee and Coca-Cola. And something I asked myself while traveling across the globe was why I had not planned to stay a week for a vacation and some sightseeing. Maybe next time…
…on the other hand, it was no problem getting up in the morning the first few days. I started Monday morning with a 7 km run with two Danish guys. At first we conversed in our own languages, but after just a minute or two we switched to English. Two countries so close to each other, yet so difficult to understand one another.
The following week was intense but rewarding: three long days of presentations followed by a two-day hands-on hackathon.
Keynote by Steven Guggenheimer
For the last 15 years the focus has been on processing power, bandwidth and storage – three things we have plenty of today. Now more companies are starting their digital transformation and changing their business models – products become services and customers become fans and followers. Connecting everything makes life simpler, but it also makes everything more complex, and we need help handling all the information. For Microsoft, the strategy is “cloud first”; client-server comes second.
Cortana is the result of research and of building blocks for the future that started 15 years ago with speech recognition. We want a natural way to use cognitive services – a bot with personality that can be proactive and hold a conversation. Microsoft is not there yet, but speech recognition and translation are a start. HoloLens is just a computer with sensors that you put on your head, but give it a couple of years and we will see more of what is called “mixed reality”.
The digital transformation is moving faster and faster, and it is hard to keep up. Microsoft lists all the interesting things and focuses on what matters in the coming year.
Moving to the cloud
Telmo Sampaio and his colleagues work between the engineers and the customers, helping the customers move to the cloud. You must get to the cloud in some way, but you need to do it in steps: start with virtual machines and slowly move to services. First, choose Azure Active Directory. The same identity that is used on-premises is used in the cloud, where the authentication happens with the help of Azure AD Connect.
Haishi Bai talked more about why you should move to the cloud and what to expect. To get high availability in the cloud you should design for redundancy rather than rely on great hardware. You should expect more reboots and avoid single points of failure. Everything should be decentralized, and using several storage accounts is important, Telmo explained. In a world full of ransomware, you should also assume breaches and have a strategy so that you never need to pay Bitcoin for a captured, encrypted database.
About passwords in the cloud
Telmo thinks that the term “password synchronization” in Azure AD Connect is very unfortunate, because many are afraid of storing their passwords in the cloud and it worries customers. In a hybrid solution, it is a hash of a hash of the password that is stored in the cloud.
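As a rough illustration of that “hash of a hash” idea – not Azure AD Connect’s actual algorithm; the hash functions, salt handling and iteration count below are my own placeholders – the synchronized value can be thought of as a salted, deliberately slow re-hash of the hash the on-premises directory already stores:

```python
import hashlib
import os


def cloud_sync_hash(password, salt=None):
    # First hash: roughly what the on-premises directory already stores.
    # (Illustrative algorithm choice, not the real one.)
    onprem_hash = hashlib.sha256(password.encode("utf-16-le")).digest()
    # Second hash: a salted, slow re-hash of the first hash. Only this
    # derived value would ever be synchronized to the cloud, so the
    # plaintext password never leaves the premises.
    salt = salt or os.urandom(16)
    synced = hashlib.pbkdf2_hmac("sha256", onprem_hash, salt, 100_000)
    return salt, synced
```

Even if the synchronized value were to leak, an attacker would still face two layers of one-way hashing between it and the original password.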
Resource deployment in the cloud
For resource deployment in the cloud, you use building-block templates. Microsoft provides a naming convention that the building blocks use out of the box. Parameters are stored in JSON-formatted files, and the template building blocks are public on GitHub. Templates can be deployed with the Resource Manager REST API. A PowerShell script always ships with a building block, but with the .NET libraries you can create and manage the resources in C#.
For Microsoft, security is not a choice, and they put a lot of effort into it. Brandon Koeller explained that Microsoft owns more of the stack today than before and therefore needs to make it as secure as possible, but they see that attackers no longer go after the infrastructure – they go after the users! At Microsoft, they automate as much service work as they can, and employees are abstracted from customer data. If an employee needs access to change something, he or she asks for permission in an internal app called “Lockbox”. All possible scenarios and risks are discussed. When the request is approved, a temporary password is created with access for a limited time, and the session is logged and audited. Lockbox is not only used internally at Microsoft but also, in rare cases, when engineers troubleshoot customer problems in Office 365.
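As a minimal illustration of the template/parameter split (the resource, names and API version below are placeholders of my own, not one of the official building blocks), an Azure Resource Manager template declares parameters and resources in JSON:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2016-01-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "Storage"
    }
  ]
}
```

The actual parameter values live in a separate JSON file, so the same template can be reused across environments and deployed through the portal, the REST API, PowerShell or the .NET libraries.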
Phishing is the number one risk
I visited the Microsoft Cybercrime Center for a short but interesting tour. Microsoft is investing one billion dollars a year in fighting botnets and malware together with other American companies.
Office 365 Secure Score
It can be hard to know which actions are the most valuable when configuring Office 365. Security admins love games and competition, so Microsoft created a series of PowerShell scripts that gather configuration information, which is then evaluated against various criteria. The output is a set of recommendations for how to remediate the security gaps, plus a score for comparison with other admins.
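I imagine the evaluation step working something like this sketch (the criteria names and point weights are invented for illustration; the real scripts and scoring model are Microsoft’s own):

```python
# Invented criteria for illustration: (setting, points awarded when enabled)
CRITERIA = [
    ("mfa_enabled_for_admins", 50),
    ("mailbox_auditing_enabled", 10),
    ("dlp_policies_configured", 20),
]


def secure_score(config):
    """Evaluate gathered configuration against weighted criteria and
    return a score plus remediation recommendations for the gaps."""
    score, recommendations = 0, []
    for setting, points in CRITERIA:
        if config.get(setting):
            score += points
        else:
            recommendations.append(f"Enable {setting} (+{points} points)")
    return score, recommendations
```

The score on its own means little; the value is in the ranked list of gaps and in comparing your number with other tenants over time.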
Security advice for developers from Barry Dorrans
- Don’t use GitHub for reporting security issues! Microsoft has a bug bounty program that offers direct payments
- Don’t publish your signing key!
- Prevent hash DoS by not using user input as keys in dictionaries, unless the user input is a string or the hash code for the input is strong and you implement a session key
- Prevent “Padding Oracles” (MS10-070) by not exposing padding oracles; add an authenticated signature to the encrypted data and validate it
- Prevent “Infinite Regex” (MS15-101) by setting timeouts on all regular expressions, or by finding a better way to validate than regular expressions
- Prevent URL abuse: when constructing URLs from user input, validate the string
- Prevent displaying the wrong characters by using explicit string-comparison methods instead of the equality operator
- Don’t store Unicode characters in a non-Unicode database column! …And HTML-encode data coming from SQL
- Prevent ZIP bombs by turning off DTD parsing
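Barry’s point about comparing strings deliberately rather than with a plain equality operator can be illustrated with Unicode normalization (the example is my own, in Python rather than .NET): two strings can render identically on screen yet consist of different code points.

```python
import unicodedata

composed = "\u00e9"     # 'é' as a single code point
decomposed = "e\u0301"  # 'e' followed by a combining acute accent

# A naive comparison of the raw code points says the strings differ...
naive_equal = composed == decomposed

# ...but comparing a canonical normal form treats them as the same text.
normalized_equal = (
    unicodedata.normalize("NFC", composed)
    == unicodedata.normalize("NFC", decomposed)
)
```

Here `naive_equal` is False while `normalized_equal` is True – exactly the kind of gap an attacker can exploit with look-alike identifiers if comparisons are not done deliberately.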
Slides and code examples
Developing features in the cloud
Scott Guthrie talked about the Azure strategy and explained that they want to provide a platform where developers can focus on features. If you do something manually, like deploying through the portal, you should step back and think about how to do it automatically. Donovan Brown presented one of the DevOps tools they have put together as an example of automation: Yo Team – build a team project with a continuous delivery and continuous deployment pipeline in five minutes. As a hobby user of Visual Studio Online, I can see how it is becoming easier to set up support for an agile team.
- Azure Functions – event-based serverless compute experience
- Azure Logic Apps – Visualize business processes, integrate with SaaS and enterprise applications, out of the box connectors
Scott says that when developing in the cloud you should think of every project as a microservice, and Microsoft’s goal is to support that model. Some options for implementing microservices on Azure are Service Fabric, Azure Container Service, Docker Cloud, Docker on a VM, or App Service.
Haishi talked more about microservices and how important it is to build workloads that can move. You should be able to handle errors, with failover mechanisms in place. Data should be partitioned and kept close to the compute. Share nothing, think reactive, and use explicit contracts between services. Make each service independently deployable.
Masashi Narumoto pointed out that you must be prepared before you start building microservices: you need domain knowledge, a skill set for distributed systems, a DevOps culture and monitoring capability. If you have a very simple domain or infrequent updates, you lose some of the benefits of microservices and probably do not need them. Improving and upgrading the tools surrounding a monolith can be a better way to make a system faster. He recommends Vaughn Vernon’s book Implementing Domain-Driven Design to improve the skill set for developing microservices.
If we can predict the questions, we can optimize the indexes and schemas. If a system should be able to figure out the questions itself, we use “schema on read” instead of a relational database, where the schema is applied before the data is stored. Unstructured data can be stored in Azure Blob Storage and the schema applied afterwards. Microsoft provides a set of tools called Azure Data Lake, along with Hadoop, Spark, U-SQL and HDInsight. These products are expensive and not recommended for data sets under 1 terabyte. The Bing index, called “Bingdex”, is 5 exabytes today and is expected to be around 20 zettabytes in 2020. You cannot move that kind of data, so the compute must be pushed out to the nodes. For data under 1 terabyte, Microsoft provides more classic products like Azure SQL Data Warehouse with cubes, Excel and Power BI.
Michael Rys gave a cool demo of U-SQL, a very powerful data-processing language that combines T-SQL and C#.
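A tiny sketch of the “schema on read” idea (the records and field names are made up): heterogeneous records are stored untouched, and a schema is projected onto them only at the moment they are read.

```python
import json

# Raw records stored as-is, the way they might sit in blob storage.
# Note that they do not share a common schema.
raw_records = [
    '{"sensor": "a1", "temp_c": 21.5}',
    '{"sensor": "a2", "humidity": 0.4}',
]


def read_with_schema(lines, fields):
    """Apply a schema at read time: project the requested fields,
    filling gaps with None instead of rejecting records at write time."""
    for line in lines:
        record = json.loads(line)
        yield {field: record.get(field) for field in fields}


rows = list(read_with_schema(raw_records, ["sensor", "temp_c"]))
```

A relational database would have forced both records into one table shape before storing them; here the second record simply yields `None` for the field it lacks, and a different question tomorrow can project a different schema over the same raw data.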
With Microsoft Cognitive Services it is possible to know what a word means, and to recognize patterns and classify them into types, like a human face. These kinds of services can fix problems proactively before they start. By collecting data about weather patterns and soil, predictions can be made about when to plant. Real-time engine health data from sensors in an airplane can give a pilot feedback on how to optimize the landing.
If you are interested in cognitive bots, machine learning, deep learning or other advanced analytics in the cloud, there is a lot to learn. You can start by reading about languages like R and Python or products like Hadoop and Azure Data Fabric. Remember that playing around with a huge data set can be expensive: a group at the hackathon burned 1,800 NOK in a short time fetching and computing over Twitter messages. Not fun for the wallet, but they got a nice set of real-life data to show in Power BI graphs.