If you are working with earlier versions of plain old ASP.NET, but need to upgrade your skills for Microsoft Azure DevOps, you may be looking to get up to speed on its smarter brother, ASP.NET Core, which is growing in popularity with the developer community.
While C# is the "most loved" programming language, according to a recent survey by developer tooling specialist JetBrains, there is also growing regard for .NET Core, according to a recent Visual Studio Magazine article. "… the survey indicates Microsoft's new open source, cross-platform 'Core' direction is gaining traction but still has a long way to go as it usurps the ageing, Windows-only .NET Framework, with .NET Core and ASP.NET Core leading the migration," the article noted.
ASP.NET Core is "a complete rewrite that unites the previously separate ASP.NET MVC and ASP.NET Web API into a single programming model," according to a Wikipedia article. "ASP.NET Core applications support side by side versioning in which different applications, running on the same machine, can target different versions of ASP.NET Core. This is not possible with previous versions of ASP.NET."
Microsoft sought to distance the new framework from the older versions of ASP.NET. The company didn't want it to be thought of as simply an update, so the working title, ASP.NET 5, was changed to ASP.NET Core 1.0 for its 2016 release to highlight its status as a brand-new product.
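To make the "single programming model" concrete, here is a minimal sketch (a hypothetical ProductsController, not taken from the Microsoft tutorial) in which one controller type serves both an HTML view and a JSON API endpoint:

```csharp
using Microsoft.AspNetCore.Mvc;

// One controller type covers both worlds: an MVC-style action that
// renders a view, and a Web API-style action that returns JSON.
public class ProductsController : Controller
{
    // MVC: renders the Views/Products/Index.cshtml view
    public IActionResult Index() => View();

    // Web API: returns serialized JSON from the same class
    [HttpGet("api/products")]
    public IActionResult List() => Json(new[] { "Widget", "Gadget" });
}
```

In classic ASP.NET, the view-returning action would live on a System.Web.Mvc controller and the JSON action on a separate ApiController; in ASP.NET Core both hang off a single base class.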
Microsoft has recently published an Introduction to ASP.NET Core that covers what developers need to know about the framework for building cloud-based, Internet-connected applications. It touts the framework's advantages including:
- Build web apps and services, IoT apps, and mobile backends.
- Use your favorite development tools on Windows, macOS, and Linux.
- Deploy to the cloud or on-premises.
- Run on .NET Core or .NET Framework.
Why ASP.NET Core?
Microsoft is aiming at ASP.NET 4.x developers, said to number in the millions, touting ASP.NET Core's ability to integrate seamlessly with popular client-side frameworks and libraries, including Blazor, Angular, React, and Bootstrap. The company also points to additional benefits of the new framework.
Beyond explaining what the framework can do, the Microsoft tutorial page offers developers a step-by-step "learning path" with code samples for a basic app to show how it works.
Would you rather see how it's done than read the instructions? There is a fun and informative one-hour YouTube video featuring Daniel Roth, a program manager on Microsoft's ASP.NET team, covering "Full stack web development with ASP.NET Core 3.0 and Blazor." The highly entertaining host shows an enthusiastic live audience how to build a pizza store web app.
If videos are your thing, Microsoft's ASP.NET Community Standup site offers scheduled tutorials that you can watch live. All the past live events, several of them featuring the irrepressible Roth, are available for replay. Recent episodes also included the June ASP.NET Core 3.0 Preview 6 Release Party.
Hands-on Training for Building a Modern DevOps Pipeline
If you are looking for in-person, hands-on training, ASP.NET Core will be in the spotlight on Sept. 29 at VS Live! San Diego. There will be a full-day hands-on lab: Building a Modern DevOps Pipeline on Microsoft Azure with ASP.NET Core and Azure DevOps.
Attendees will come away with an ASP.NET Core app and a SQL Server Database running in Azure with a full continuous integration / continuous deployment (CI/CD) pipeline managed by Azure DevOps.
The instructors will begin with a review of the current thinking on DevOps. Next comes the planning and tracking phase, where the architecture of the app will be broken out and defined. Then the dev & test phase, where attendees implement feature flags, get CI builds working, add manual and automated tests, and more. In the release phase, you will learn how to create a deployment pipeline to multiple environments and how to validate a deployment after its release using Azure App Services (both web apps and containers). Finally, the monitor and learn phase will cover analytics and user feedback, and how you start the cycle over again.
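Feature flags, one of the dev & test topics, can be as simple as a dictionary lookup consulted before a code path runs. A minimal sketch (hypothetical flag names; production code would more likely use a library such as Microsoft.FeatureManagement):

```csharp
using System;
using System.Collections.Generic;

class FeatureFlags
{
    private readonly Dictionary<string, bool> _flags;

    public FeatureFlags(Dictionary<string, bool> flags) => _flags = flags;

    // Unknown flags default to off, so unreleased code paths stay dark.
    public bool IsEnabled(string name) =>
        _flags.TryGetValue(name, out bool on) && on;
}

class Program
{
    static void Main()
    {
        var flags = new FeatureFlags(new Dictionary<string, bool>
        {
            ["new-checkout"] = false, // dark-launched, toggled at release time
            ["beta-search"] = true,
        });

        Console.WriteLine(flags.IsEnabled("beta-search"));  // True
        Console.WriteLine(flags.IsEnabled("new-checkout")); // False
    }
}
```

The point of the pattern is that deployment and release are decoupled: the code ships dark, and the flag flips it on without a redeploy.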
By the end of the day, you will have a CI/CD pipeline configured, a deployed app, and the hands-on experience on how to build a modern ASP.NET Core and SQL Database solution that runs in Azure using Azure DevOps.
You can find out more and register here.
Posted by Richard Seeley on 07/19/2019
At one time data was mostly a topic for database administrators. Not anymore. With artificial intelligence exploding on the scene, Big Data is a hot topic in the developer world.
The meteoric rise of the Internet of Things (IoT) is creating huge datasets – think terabytes and petabytes – that go beyond what traditional relational database management systems (RDBMS) and legacy software analytics tools can handle, according to a Wikipedia article. Sensory data is pouring in from IoT devices in industries including medical, manufacturing and transportation. That data is useful, sometimes even crucial, but those industries need a way to make sense of it.
A new generation of data analytics applications is needed to deal with what one analyst called the coming "datapocalypse." This presents a challenge and an opportunity for developers if they have the skills and tools to create those apps for business users.
On the tool front, .NET developers recently got good news with the preview of .NET for Apache Spark, which will allow them to more easily use the popular Big Data processing framework in C# and F# projects, according to an article by David Ramel, editor of Visual Studio Magazine.
"Spark is described as a unified analytics engine for large-scale data processing, compatible with Apache Hadoop data whether batched or streamed," the editor explained. "Currently, Spark is accessible via an interop layer with APIs for the Java, Python, Scala and R programming languages. While .NET coders have been able to use Spark with Mobius C# and F# language binding and extensions, the new project seeks to improve on that scheme while paving the way to add more language support."
In its announcement of .NET for Apache Spark, Microsoft said it "… provides high performance APIs for using Spark from C# and F#. With [these] .NET APIs, you can access all aspects of Apache Spark including Spark SQL, DataFrames, Streaming, MLLib etc. .NET for Apache Spark lets you reuse all the knowledge, skills, code, and libraries you already have as a .NET developer."
Microsoft’s .NET for Apache Spark website explains use cases:
- Large streams of data can be processed in real-time with Apache Spark, such as monitoring streams of sensor data or analyzing financial transactions to detect fraud.
- Apache Spark can reduce the cost and time involved in building machine learning models through distributed processing of data preparation and model training, in the same program.
- Modern business often requires analyzing large amounts of data in an exploratory manner. Apache Spark is well suited to the ad hoc nature of the required data processing.
Because .NET is an open source project, Microsoft says the free framework, which now includes .NET for Apache Spark, requires no fees or licensing costs even for commercial projects.
There is a GitHub site with a tutorial for developers looking to get started with .NET for Apache Spark.
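The tutorial builds a word-count job, whose general shape looks roughly like the sketch below (based on the preview Microsoft.Spark package; "input.txt" is a placeholder path, and the job must be launched through spark-submit rather than run directly):

```csharp
using Microsoft.Spark.Sql;
using static Microsoft.Spark.Sql.Functions;

class WordCount
{
    static void Main()
    {
        // Connect to (or create) a Spark session for this app.
        SparkSession spark = SparkSession
            .Builder()
            .AppName("word_count")
            .GetOrCreate();

        // Read each line of the file into a single-column DataFrame.
        DataFrame lines = spark.Read().Text("input.txt");

        // Split lines into words, then count occurrences of each word.
        DataFrame counts = lines
            .Select(Explode(Split(Col("value"), " ")).Alias("word"))
            .GroupBy("word")
            .Count();

        counts.Show();
        spark.Stop();
    }
}
```

The same DataFrame calls are available from F#, which is part of the project's pitch: reuse existing .NET skills instead of switching to Scala or Python for Spark work.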
F# for Machine Learning
Speaking of F#, developers working with the open source, cross-platform language are getting new functionality for ML. The 15-year-old language currently works with Microsoft's ML.NET machine learning framework, but Microsoft says new ML functionality is in the works.
With the latest F# 4.6, Microsoft’s primary focus is on boosting performance for medium-to-large sized solutions, according to an article in Visual Studio Magazine.
"Other work included significant reductions in cache sizes, significant reductions in allocations when processing format strings, removing ambient processing of identifiers for suggestions when encountering a compile error, removing LOH allocations for F# symbols when they are finished being type-checked, and removing some unnecessary boxing of value types that are used in lots of IDE features," Microsoft said.
Updates to F# will now be synched with Visual Studio releases, according to a Microsoft announcement that concluded by telling developers: "With this in mind, you can think of the Visual Studio 2019 release and future updates as a continuous evolution of F# tooling."
ML for the Masses with Azure Update
Developers aren’t going to have all the fun in the AI revolution. The Azure Machine Learning Web UI is being updated for business power users who do not have programming skills, according to Microsoft.
"Emphasizing our mission to scale machine learning to the masses, we now introduce automated machine learning user interface (UI), which enables business domain experts to train ML models without requiring expertise in coding," said Tzvi Keisar, senior program manager, Microsoft Azure.
Find out more in this Visual Studio Magazine article.
Microsoft's 3 AI Dev Approaches
The Azure Machine Learning UI is part of Microsoft’s three-pronged approach to AI development, summed up as “Code First, No Code and Drag-and-Drop,” according to a recent Visual Studio Magazine article.
This three-pronged approach was outlined by Bharat Sandhu, director of artificial intelligence at Microsoft, to fit different classifications of developers, or "AI authoring models":
- Code first: use any tools
- No code: use automated machine learning
- Drag and drop: make models visually
Explaining the three approaches for different types of AI authors, Microsoft sees:
- Developers and data scientists who want to write code to build machine learning models. They will take the code first model Azure Machine Learning offers.
- Business domain experts, who know data, but don't know much about machine learning or code will use Azure Machine Learning's automated machine learning 'no code' option.
- IT professionals and experts in statistics or mathematics, who are not coders but want to make their own models, will use a drag-and-drop approach.
So Microsoft is planning a big tent approach to AI development to accommodate as many developers and power users as possible.
AI, Data and ML at VSLive! at Microsoft Headquarters
If you are a developer who wants to up your skills, AI, Big Data and Machine Learning will be hot topics at VS Live! Microsoft HQ in Redmond, WA, Aug. 12 – 16. Sessions will cover:
- AI and analytics with Apache Spark and Azure Databricks
- Deep learning for developers
- Data pipelines and analytics on Azure
- SQL Server 2019 deep dive
- Azure Cosmos DB
- Power BI
Find out more and sign up here.
Posted by Richard Seeley on 06/19/2019
Everybody involved with any kind of software development has heard the DevOps mantra: "DevOps automates and speeds software delivery." Of course, not everyone practices DevOps, because there’s a learning curve and a discipline to follow, and that’s not always easy, especially in organizations demanding instant application gratification. Despite the hype, getting started in DevOps can be a challenge.
So beyond marketing mantras, what positive things can DevOps do for you?
"DevOps is a practice that unifies people, process, and technology across development and IT in five core practices: planning and tracking, development, build and test, delivery, and monitoring and operations," Microsoft explains in an overview of Azure DevOps. "When practicing DevOps, development, IT operations, quality engineering, and security teams work closely together—breaking down practices that were once siloed. Improved coordination and collaboration across these disciplines reduces the time between when a change is committed to a system and when the change is placed into production. And, it ensures that standards for security and reliability are met as part of the process. The result: better products, delivered faster ..."
If you get the time and training to implement it, DevOps could help satisfy demands for instant application gratification.
There are lots of tools to help developers but for someone new to DevOps, it’s not always clear what does what.
In the Microsoft cloud world, Azure DevOps Server and Services are designed to smooth the developer’s way into the paradigm.
"Azure offers an end-to-end, automated solution for DevOps that includes integrated security and monitoring," according to Microsoft. "The developer experience of Azure DevOps integrates with the tools of your choice. If you’re a Java developer—great—Azure provides native integrations with Eclipse. If you build with Jenkins, use it to easily deploy directly to Azure. Bring your development, IT operations, and quality engineering teams together to build, test, deploy, monitor and manage applications in the cloud."
Deciphering Microsoft Names
The products sound good but the product naming can get a little confusing. Things have evolved. First of all there is Azure DevOps Services, a subscription-based Azure platform, announced in September 2018, that's hosted from Microsoft's datacenters. Then this March, the Redmond, WA-based company announced Azure DevOps Server 2019, which is designed for organizations that want to deploy it in their own infrastructures or any datacenter. One more clarification is needed because Azure DevOps Server 2019 is replacing Team Foundation Server (TFS). Microsoft says that "Azure DevOps represents the evolution of Visual Studio Team Services (VSTS)" and suggests users of Team Foundation Server 2012 and newer versions upgrade to Azure DevOps Server 2019.
Now that that’s cleared up, what do these versions of the platform provide for developers wanting to apply the DevOps model to their Azure-based products?
In a blog Introducing Azure DevOps Service, Jamie Cool, Director of Program Management, Azure DevOps, said: "Working with our customers and developers around the world it’s clear DevOps has become increasingly critical to a team’s success. Azure DevOps captures over 15 years of investment and learnings in providing tools to support software development teams."
In announcing the Azure DevOps Server, Cool said it "includes developer collaboration tools which can be used together or independently, including Azure Boards (Work), Azure Repos (Code), Azure Pipelines (Build and Release), Azure Test Plans (Test), and Azure Artifacts (Packages). These tools support all popular programming languages, any platform (including macOS, Linux, and Windows) or cloud, as well as on-premises environments."
As with TFS, developers control where they install Azure DevOps Server and when they apply updates.
Cool says there are major updates from TFS 2018 to Azure DevOps Server 2019, and his key highlights include:
- The new navigation, which enables users to easily navigate between services, is more responsive and provides more space to focus on your work.
- Azure Pipelines has been enhanced in many ways including new Build and Release pages, and support for YAML builds.
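A YAML build, for example, replaces the designer-based build definition with a file checked into the repository alongside the code. A minimal azure-pipelines.yml sketch (hypothetical project layout; the DotNetCoreCLI task comes from the Azure Pipelines task catalog):

```yaml
# Build and test a .NET Core solution on every push to master.
trigger:
- master

pool:
  vmImage: 'ubuntu-16.04'

steps:
- task: DotNetCoreCLI@2
  displayName: 'Build'
  inputs:
    command: 'build'
    arguments: '--configuration Release'

- task: DotNetCoreCLI@2
  displayName: 'Test'
  inputs:
    command: 'test'
    arguments: '--configuration Release'
```

Because the definition lives in the repo, pipeline changes are versioned, reviewed, and branched exactly like application code.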
And for those who would rather let Microsoft manage the maintenance and updates, Cool points out that Azure DevOps Services is available.
"Each Azure DevOps service is open and extensible," according to Microsoft. "You can use them together for a full DevOps solution or with other services. If you want to use Azure Pipelines to build and test a Node service from a repo in GitHub and deploy it to a container in AWS, go for it. Azure DevOps supports both public and private cloud configurations. Run them in our cloud or in your own data center."
Azure and AI
As part of its big Build developer conference earlier this month, Microsoft announced new development features for Azure related to artificial intelligence development with machine learning.
As reported in a Visual Studio Magazine article, the news includes MLOps (DevOps for machine learning) capabilities with Azure DevOps integration. MLOps is designed to provide developers with reproducibility, auditability and automation of the end-to-end machine learning lifecycle.
"We're delivering key new innovations in Azure Machine Learning that simplify the process of building, training and deployment of machine learning models at scale," exec Scott Guthrie announced in a blog post before the conference. "These include new automated machine learning advancements and an intuitive UI that make developing high-quality models easier, a new visual machine learning interface that provides a zero-code model creation and deployment experience using drag-and-drop capabilities and new machine learning notebooks for a rich, code-first development experience."
DevOps in the Spotlight
The spotlight will be on DevOps at VSLive! in Boston, June 9 – 13. It will include a Full Day Hands-On Lab: Building a Modern DevOps Pipeline on Microsoft Azure with ASP.NET Core and Azure DevOps.
"By the end of the day you'll have your own Azure DevOps organization with a CI/CD pipeline configured, a deployed app, and the hands-on experience on how to build a modern ASP.NET Core and SQL Database solution that runs in Azure using Azure DevOps," the lab description promises.
Find out about all the learning opportunities at VSLive! in Boston.
Posted by Richard Seeley on 05/15/2019
DevOps can be a challenge especially for developers starting out with it. But a new Basic Process in Azure DevOps aims to pare down the more complicated methodologies.
In an April 2019 Microsoft video explaining how Basic Process works, Dan Hellem, program manager on the Azure DevOps team, told viewers that the goal is to not only attract engineers to the product but also keep them as loyal users: "We’ve been talking about putting out the Basic Process for several years. Looking at it from the perspective of a new person coming into Azure DevOps, we have three processes: Agile, Scrum and CMMI [Capability Maturity Model Integration]. The problem with those processes is they are very methodology heavy."
In a February 2019 blog introducing Basic Process, Hellem noted that Agile is the most popular of the three processes but even it can be difficult for newcomers.
"The Agile process still brings a set of concepts and behaviors that are not obvious to our new users and therefore some of those users have a hard time understanding Azure Boards (Basic Process)," he wrote. "For example, the four-level backlog hierarchy or the many state transition rules. These add complexities that new users don’t care about. New users come from tools such as GitHub with very simple work tracking, and they want Azure Boards to be just as easy."
In the video, he imagines how Azure DevOps may seem to the first-time user: "So if I’m a new engineer and I’m working with GitHub stuff and I come over and want to use Azure DevOps, are you really going to make me work with product backlog items and user stories and all those weird terminologies? What we found is that when engineers start using the product, they start dropping off because it is just too complicated. We want engineers to use the product, so what we did is get rid of the baggage like Agile and Scrum and CMMI. We wanted to make it easier to work with so engineers can start working with it and start getting things done."
Keeping it Simple
To achieve their goal the Azure DevOps team reduced the scope of the process to get down to the basics, Hellem explained in his blog. "To start we reduced the number of work item types down to three: Epic, Issue, and Task. By default, users can start right away by adding Issues to their board. You will notice that the board contains 3 columns. To Do, Doing, and Done. This simplified state model is used for all three work item types."
Work item types in Agile, Scrum and CMMI contain many extra fields that are not needed for someone starting out, Hellem explained. "Our research shows that many users get confused by all of the extra fields and their purpose. In the Basic Process, we kept only the core fields and removed the rest. Only fields that are required to support other functionality survived."
Hierarchy is the last area the Azure DevOps team simplified in the Basic Process. "Instead of four levels, Basic starts with just two," Hellem explained. "Users will start with Issues, and those Issues can be broken down into Tasks. For more advanced scenarios, issues may not be enough. Some users may want a way to group their issues into specific deliverables. For these users we are providing the Epic work item type."
Beyond his blog, Hellem encourages developers to read the documentation published in January 2019, Start using Azure Boards (Basic Process). Talk about simple: the documentation takes only about two minutes to read.
Other Azure DevOps Improvements
It’s been a busy few months for Azure DevOps.
In March, Microsoft announced the commercial release of the new Azure DevOps Server 2019.
"DevOps Server 2019, used for developer collaboration, is the company's rebranded successor to Team Foundation Server 2018," explained Kurt Mackie in an Application Development Trends (ADTmag) article on the announcement.
"Azure DevOps Server 2019 is notable for having a redesigned user interface that follows Microsoft's Fluent design concepts," Mackie explained. "Developers get access to various services, such as Azure Pipelines for continuous integration/continuous development across different languages and platforms. It has an Azure Artifacts service for package feeds and project tracking via the Azure Boards service. Testing is supported by the Azure Test Plans component. The server also works with the Azure Repos service to integrate with Git repos."
DevOps Specialization Pays
If you think DevOps isn’t that interesting, think again. For DevOps specialists, job satisfaction is high and so is the pay.
A survey of 88,000 developers around the world, released April 9 by Stack Overflow, found that people who know DevOps are well paid and happy in their work.
In an article on the survey, David Ramel, editor of Visual Studio magazine, wrote: "Culling through that data finds one main takeaway about the DevOps movement that has been gaining steam by providing automated processes that bridge software development and IT teams to improve the build/test/release cycle:
DevOps specialists and site reliability engineers are among the highest paid, most experienced developers most satisfied with their jobs …"
If you want to check out all the DevOps survey results, they’re available here.
DevOps in the Spotlight
Want to get on the DevOps bandwagon? One of the highlights at the upcoming VSLive! in Boston, happening June 9 through 13, will be a special track on DevOps.
See what the track description offers.
DevOps in the Spotlight: You have a role to play when it comes to DevOps, and in this track, you'll learn about the tools, techniques, and concepts that you can immediately apply to your daily work.
You'll find coverage of:
- Azure DevOps Services in the cloud
- Azure DevOps Server on premises (formerly named Team Foundation Server)
- Writing maintainable test automation
- Architecting solutions for DevOps and continuous delivery
- Getting started with git
- Azure Secure DevOps
- Database DevOps
Find out more here.
Posted by Richard Seeley on 04/17/2019
Service-oriented architecture (SOA) and web services were hot developer trends a decade ago but may only get a yawn today. Now, microservices take the concept into the cloud and container era.
If you missed the SOA and web services hoopla, and haven’t worked with microservices, here’s a brief explanation from Wikipedia:
“Microservices are a software development technique—a variant of the service-oriented architecture (SOA) architectural style that structures an application as a collection of loosely coupled services.”
Microservices and containers are growing in popularity because the modular approach is better suited to today’s dynamic business needs than older programming methodologies with zillions of lines of code.
As Michael Otey explained in an in-depth Redmond Magazine article: “Traditional monolithic applications consist of large executable programs that are complex and can be difficult to deploy and update. In contrast, container-based microservice applications are composed of many small independent services running in containers that enable the organization to deploy and update granular microservices individually without impacting the entire application. This microservice architecture potentially makes these applications more resilient, as well as easier to deploy and update.”
If you want to take a deeper dive into microservices as well as containers, there are resources available to help you get up to speed.
Visual Studio users working with .NET can download a free 340-page eBook from Microsoft, .NET Microservices: Architecture for Containerized .NET Applications. Reading and digesting 340 pages of material is going to be a challenge. The authors offer some Key Takeaways to provide Microsoft’s overview of this way of doing application development.
Making the case for microservices architecture, the authors assert that it is “becoming the preferred approach for distributed and large or complex mission-critical applications based on many independent subsystems in the form of autonomous services. In a microservice-based architecture, the application is built as a collection of services that are developed, tested, versioned, deployed, and scaled independently. Each service can include any related autonomous database.”
For developers who have been working with SOA, this Lego approach to building an application from existing pieces of software may be old hat. One new wrinkle since SOA emerged in 1998 is containers, specifically the Docker technology first introduced in 2013. For cloud applications, microservices and containers go together, although containers also have wider utility in the .NET world, as the authors of the Microsoft eBook explain: “Containers are convenient for microservices, but can also be useful for monolithic applications based on the traditional .NET Framework, when using Windows Containers. The benefits of using Docker, such as solving many deployment-to-production issues and providing state-of-the-art Dev and Test environments, apply to many different types of applications.”
As one of the major contributors to the open source container project, Microsoft is pretty much all in with Docker when it comes to microservices architecture. “Docker-based containers are becoming the de facto standard in the industry, supported by key vendors in the Windows and Linux ecosystems, such as Microsoft, Amazon AWS, Google, and IBM. Docker will probably soon be ubiquitous in both the cloud and on-premises datacenters,” the eBook authors write.
But you don’t have to take Microsoft’s word for it. On its website, Docker is touting Forrester Research Inc.’s latest take on the technology, New Wave Enterprise Container Platform, Q4 2018 Report. Docker “leads the pack with a robust container platform well-suited for the enterprise,” the container company says.
“The purpose of Docker is to build containers that hold, potentially, all of the components of an application: the application itself, the database engine, any Web services it requires and so on. That container, unlike a virtual machine, doesn't require an operating system so it takes less space than a VM and starts up/shuts down faster,” explained Peter Vogel, in a recent Visual Studio Magazine article, Understanding Docker Vocabulary.
For those starting out with using containers for their applications, Vogel said there’s good news. “There are a bunch of prepared containers waiting for you to use on Docker Hub: these are called images. Many of them are Linux based, but for .NET Core applications that's not an issue: Core runs as well on Linux as Windows.”
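Building on one of those Hub images, containerizing an ASP.NET Core app takes only a short Dockerfile. A hedged sketch (hypothetical project name MyApp; image tags reflect the 2.2-era mcr.microsoft.com registry current at the time of writing):

```dockerfile
# Stage 1: compile and publish the app using the full SDK image.
FROM mcr.microsoft.com/dotnet/core/sdk:2.2 AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

# Stage 2: copy the published output into the smaller runtime-only image.
FROM mcr.microsoft.com/dotnet/core/aspnet:2.2
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "MyApp.dll"]
```

The multi-stage build keeps the SDK out of the final image, so what ships is just the runtime plus the published app, which echoes Vogel's point about containers being smaller and faster to start than VMs.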
While Docker is given credit for popularizing containers, Vogel notes that when it comes to container orchestration the “elephant in this living room is Kubernetes.”
Kubernetes, first released in 2015, is an open source container orchestration system developed by Google but also championed by Microsoft.
In his Redmond article on containers referenced above, Otey explained: “Kubernetes provides automated deployment, scaling and operations for containers. It provides a management control plane for containers that works above the container level … Essentially, Kubernetes is used for managing distributed application containers across clusters of physical or virtual machines. It supports a range of different container tools including support for Docker. Kubernetes allows you to combine multiple containers that make up an application into logical groups for easier management. It's designed to enable you to easily deploy, update and scale your applications, as well as optionally limiting hardware utilization.”
The article explained that Kubernetes is the Greek word for helmsman or pilot. For non-Greek speakers who have trouble pronouncing the name, Otey said "Kubernetes is sometimes referred to as K8S, where the eight letters of 'ubernete' are replaced by the number 8."
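The declarative deployment and scaling Otey describes is expressed in manifests like this minimal Deployment (hypothetical app name and image), which asks Kubernetes to keep three replicas of one container running:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: pizza-store
spec:
  replicas: 3            # Kubernetes keeps three copies running at all times
  selector:
    matchLabels:
      app: pizza-store
  template:
    metadata:
      labels:
        app: pizza-store
    spec:
      containers:
      - name: web
        image: registry.example.com/pizza-store:1.0
        ports:
        - containerPort: 80
```

Scaling up becomes a one-line change to `replicas`; if a container crashes or a node dies, the control plane replaces the missing copies automatically.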
Visual Studio Live! New Orleans: Cloud, Containers and Microservices
Reading documentation on application development approaches is good but it is hard to beat in-person learning from experts who really know their stuff. Visual Studio Live! in New Orleans, April 22 to 26, is offering a track covering Cloud, Containers and Microservices including:
- Microservice architectures
- Containers 101
- Azure Kubernetes Services
- Azure Active Directory B2C
- Serverless and Azure Functions
- Intelligent apps in the cloud with Azure Cognitive Services
Find out more.
Posted by Richard Seeley on 03/20/2019
"I will talk about two sets of things. One is how productivity and collaboration are reinventing the nature of work, and how this will be very important for the global economy. And two, data. In other words, the profound impact of digital technology that stems from data and the data feedback loop." ~ Microsoft CEO Satya Nadella
Collaboration, productivity and data are what Azure Databricks is all about.
Data, of course, is everything. How many F150 trucks did Ford build this year? What is the patient’s heart rate and blood pressure? Where can we find sushi at this time of night?
The iPhone in your back pocket is filled with data and searching for more.
How does that data get organized so you can find it when you need it?
Machines do a lot of that work. But people working in collaboration with other people and other machines make the data driven world go around.
Azure Databricks is a collaboration between Microsoft and the creators of Apache Spark, which is described on its homepage as an "analytics engine for big data processing, with built-in modules for streaming, SQL, machine learning and graph processing."
Databricks, the company created to commercialize Spark, "provides a Unified Analytics Platform for data science teams to collaborate with data engineering and lines of business to build data products," the company states. "Users achieve faster time-to-value with Databricks by creating analytic workflows that go from ETL and interactive exploration to production. The company also makes it easier for its users to focus on their data by providing a fully managed, scalable, and secure cloud infrastructure that reduces operational complexity and total cost of ownership."
One Click Setup and Management
In announcing the collaboration with Databricks in 2017, Microsoft touted Azure Databricks as "a fast, easy and collaborative Apache Spark-based analytics platform that delivers one-click setup, streamlined workflows and an interactive workspace. Native integration with Azure SQL Data Warehouse, Azure Storage, Azure Cosmos DB, Azure Active Directory and Power BI simplifies the creation of modern data warehouses that enable organizations to provide self-service analytics and machine learning over all data with enterprise-grade performance and governance."
In a Microsoft overview of Azure Databricks, the company explains its value-add: "Azure Databricks features optimized connectors to Azure storage platforms (e.g. Data Lake and Blob Storage) for the fastest possible data access, and one-click management directly from the Azure console. This is the first time that an Apache Spark platform provider has partnered closely with a cloud provider to optimize data analytics workloads from the ground up."
What Does All This Mean?
So here’s this extensive set of data tools: what can you build with them? In January, Databricks provided answers from data industry thought leaders, who focused on the need for solutions to the issues organizations face with AI, Big Data and Analytics.
Kamelia Aryafar, chief algorithm officer at Overstock, sees deep learning, which is a class of machine learning algorithms facilitated by the Spark technology, paying dividends for organizations. "Deep learning innovations will create a lot of new AI applications, some of which are already in production and making massive changes in the industry," she is quoted as saying. She noted that Overstock is currently using deep learning to improve marketing projects such as email campaigns.
Other thought leaders quoted by Databricks see the need for the latest data tools to be used to improve long-standing issues including data processing and providing trusted data with "Explainable AI."
Because of the social, economic and commercial implications of the data being generated, "it is critical to develop AI that is explainable, provable and transparent," said Mainak Mazumdar, chief research officer at Nielsen.
Databricks CEO and co-founder Ali Ghodsi finds data processing to still be a challenge in the AI era. "As an industry we tend to believe that data scientists are spending the majority of their time developing models," he shares. "Truth be told, data processing remains the hardest and most time consuming part of any AI initiative. The highly iterative nature of AI forces data teams to switch between data processing tools and machine learning tools. For organizations to succeed at AI in 2019, they have to leverage a platform that unifies these disparate tools."
Reading between the lines, Databricks provides the platform companies need to leverage.
Databricks concluded the survey on the near future of AI and Big Data, stating: “Solving the world’s toughest data problems starts with bringing all of the data teams together within an organization. Data science and engineering teams’ ability to innovate faster has historically been hindered by poor data quality, complex machine learning tool environments, and limited talent pools. Additionally, organizational separation creates friction and slows projects down, becoming an impediment to the highly iterative nature of AI projects. Much like in 2018, organizations that leverage Unified Analytics will have a competitive advantage with the ability to build data pipelines across various siloed data storage systems and to prepare labelled datasets for model building, which allows organizations to do AI on their existing data and iteratively do AI on massive data sets.”
Practical Use Cases
Healthcare is an area where AI can be used to parse patient data to provide diagnostic and other assistance to medical professionals.
Last June, Databricks announced that it had been working with pharmaceutical and healthcare providers "to improve their drug discovery processes."
"One such customer, the Regeneron Genetics Center (a wholly-owned subsidiary of Regeneron, a leading biotechnology company), has sequenced over 300,000 consented volunteers and paired their de-identified genetic data with de-identified electronic health records to uncover actionable insights for drug discovery and development,” Databricks said in the announcement.
Jeffrey Reid, PhD, Head of Genome Informatics at Regeneron, was quoted as saying: "As this dataset has grown rapidly, we encountered significant barriers in simple tasks, like gathering all of the data for a given analysis, and querying the 10s of billions of results from our studies. Not only has the Databricks Unified Analytics Platform solved these big data problems, but it is enabling everyone in our integrated drug development process – from physician-scientists to computational biologists – to easily access, analyze, and extract insights from all of our data. Drug development is still a long and difficult process rife with failure, but we have already significantly reduced the amount of time it takes to generate important early insights."
Databricks cited the following ways its platform enables medical researchers to:
- Accelerate discovery with simplified genomic pipelines: Simplify workflows with prebuilt genomic pipelines hosted in the cloud to process large datasets up to 100x faster than existing solutions.
- Innovate faster with interactive, tertiary analytics and AI at scale: Quickly and simply run tertiary analytics and machine learning algorithms on massive genomic datasets with prepackaged frameworks designed to run in parallel.
- Improve productivity across data, analytics and research teams: Create a collaborative environment and shared workspaces for bioinformaticians, computational biologists and researchers to work together across the research lifecycle with shared workspaces, saving teams precious time and resources.
Training for AI, Data, and Machine Learning
Working with AI, Big Data and Machine Learning is the future of application development. If you want to build Azure Databricks skills, Visual Studio Live! New Orleans this April offers a session on AI and Analytics with Apache Spark on Azure Databricks where you will learn:
- About the fundamentals of Apache Spark, Spark SQL and Spark MLlib
- How to use Databricks notebooks
- How to manage clusters and jobs
- How to integrate Azure Databricks with blob storage and Azure Data Lake Store (ADLS)
- How to write Python code for both analytics and machine learning
Find out more here.
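You don't need a Spark cluster to see the shape of the analytics pipelines that session covers. Below is a minimal pure-Python sketch (standard library only; the data and names are invented for illustration) of the filter/map/aggregate pattern that Spark's RDD and DataFrame APIs parallelize across a cluster:

```python
from functools import reduce

# Hypothetical restaurant records standing in for a large distributed dataset.
records = [
    {"city": "Tokyo", "category": "sushi", "rating": 4.5},
    {"city": "Tokyo", "category": "ramen", "rating": 4.0},
    {"city": "Osaka", "category": "sushi", "rating": 3.5},
    {"city": "Tokyo", "category": "sushi", "rating": 5.0},
]

# Spark-style pipeline: filter, then map, then aggregate.
sushi = filter(lambda r: r["category"] == "sushi", records)
tokyo_ratings = [r["rating"] for r in sushi if r["city"] == "Tokyo"]
avg_rating = reduce(lambda a, b: a + b, tokyo_ratings) / len(tokyo_ratings)

print(avg_rating)  # → 4.75
```

In Spark the same chain of transformations would be declared lazily and executed in parallel across the cluster's workers, but the mental model of composing filters, maps and aggregations is the same.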
Posted by Richard Seeley on 02/20/2019
Xamarin is one of the biggest tech success stories of the decade.
In 2011 few if any developers had even heard of Xamarin. That was the year its namesake San Francisco-based company was founded. Six years later, “1.4 million developers were using Xamarin's products in 120 countries around the world,” according to Wikipedia. Is it any wonder that Microsoft acquired it in 2016 and made it a free tool for Visual Studio developers?
So now Xamarin is in the big time, touting itself in Wikipedia as “the only IDE that allows for native Android, iOS and Windows app development within Microsoft Visual Studio.” Plus, it has become its own standalone IDE, Xamarin Studio, for Windows and Mac mobile app development. There are also:
- Xamarin Test Cloud
- Xamarin .NET Mobility Scanner
- Xamarin RoboVM for Java
Then Came Flutter
As 2019 dawned, Visual Studio Magazine popped the question: Will Flutter Become a Xamarin Option?
“Flutter for Xamarin” is something of a work in progress, explains David Ramel, editor of Visual Studio Magazine. It is an under-construction project championed by some developers on GitHub. And there is some debate as to whether Flutter and Xamarin are a match made in app dev heaven. Reading the article, you get the sense that the Flutter project is somewhere between a potential breakthrough for developers and a whimsical hobby for coders with time on their hands. The latter view comes from Adam Pedley, the leader of the Flutter for Xamarin project, who describes it on GitHub this way:
"This project is never expected to be commercially viable, unless it is picked up or supported by a larger company. As it currently stands, this is just a fun side project, done by a bunch of developers in their spare time. We offer no support for solutions ever built with this framework, or any guarantee of completion."
There are developers who question why this project even exists. However, if we are talking about support from “a larger company,” it’s important to remember that Flutter is Google's mobile UI framework and you don’t get much larger than Google.
In the article, Ramel found some serious interest in the potential value of Xamarin + Flutter. One developer offered some reasons why this match might work:
- Because you still want to work in C# but want the ease of use and performance of Flutter?
- Because you have existing .NET logic code shared with other projects that you want to re-use?
- Because you still want to take advantage of the vast array of .NET assemblies and NuGet packages at your disposal?
- Because the maintainer wanted a hobby project to see what was possible?
There is no telling where this will go but in the comments section for the Visual Studio Magazine article, it did spark debate with comments ranging from: “I honestly don't see the point of this. Xamarin is already an alternative to Flutter. Why would one combine React and Angular?” to “Yes I know that [it’s an] unofficial project, it is very interesting & promising …”
So, stay tuned.
Enter Xamarin.Forms 4.0
Another recent Xamarin news flash is about the big changes Microsoft is unveiling in Xamarin.Forms 4.0.
Having just released Xamarin.Forms 3.4.0, Microsoft took the unusual step of simultaneously releasing a preview of Xamarin.Forms 4.0. You can read all about it in this Visual Studio Magazine article.
Microsoft’s here’s-a-new-release-and-here’s-a-preview-of-the-next-release strategy seems to be about keeping up with developer demands.
In a blog announcing the early preview, Microsoft's David Ortinau wrote: "Through countless interviews, conversations, and surveys, we have heard your voice loud and clear. You want Xamarin.Forms to be easier to use 'out of the box', navigation to be ever present and easier to control, to have a more consistent design across iOS and Android, and to have a faster, more flexible list control."
You can read the entire blog here.
Xamarin Does Las Vegas
If you are interested in getting up close and personal with Xamarin, the place to be is Bally's Hotel & Casino in Las Vegas March 3 ‐ 8 for Visual Studio Live!
There is a “Full Day Hands‐On Lab: Xamarin and Azure: Build the Mobile Apps of Tomorrow” featuring Brandon Minnick, who works “helping fellow Xamarin developers build 5-star apps,” along with Laurent Bugnion, Microsoft Most Valuable Professional (Client Dev) and a Microsoft Regional Director.
If you want to learn more about working with Xamarin.Forms, there’s “Xamarin.Forms Takes You Places!” led by Microsoft MVP Sam Basu.
Beyond Xamarin, there are six days of in-depth training on the Microsoft platform, including hot topics like .NET, Visual Studio, DevOps, SQL Server and more.
Find out more about Visual Studio Live! in Las Vegas.
Posted by Richard Seeley on 01/17/2019
Whether you’ve dabbled in AngularJS or even worked extensively with it, you want to upgrade your skills to move to the latest versions of Angular because that’s where the jobs are.
Over the past two years, the Angular team at Google has released a string of new versions, so we have Angular 2/4/5, and in May of this year Angular version 6.0.0 appeared. (In case you’re wondering, Angular 3 got lost in the development shuffle and, like Windows 9, it basically doesn’t exist.) These TypeScript versions are sometimes lumped together as “Angular 2+,” according to a Wikipedia article.
But you can think of “Angular v2 and above” as “plain ol' Angular,” says Ted Neward, who is director of Developer Relations at Smartsheet.com and well-known as a presenter at Visual Studio Live! His interview, Angular Q&A: Components, Getting-Started Tips (and that 'Total Rewrite' Thing) in Visual Studio Magazine is a great place to start learning about all things Angular.
Beyond that if you want to get up to speed on plain ol’ Angular, Neward recommended the resources available on the official Angular website.
Angular IDEs and Tools
The Angular website provides links to a number of Integrated Development Environments (IDEs) designed to make coding easier with features like drag and drop.
First on the list is Amexio API v5.2.1, with Angular 6 support and D3 Charts for visualization. The product from MetaMAGIC Global Inc., based in India and New Jersey, provides an Angular UI automation platform including:
- Angular 6
- 130+ UI Widgets
- Drag & Drop Widget
- Responsive Web Design
- 57 Material Design Themes
- D3 Charts / Maps / Dashboard / Layouts
Texas-based Genuitec, a founding member of the Eclipse Foundation, is the developer of a commercial Eclipse tool, Angular IDE by Webclipse, which the company markets as “Simple for beginners; powerful for experts” providing:
- TypeScript 3.x validation and debugging
- Advanced coding of HTML templates with validation and auto-complete
- Integrated usage of the angular-cli for Angular best practices
Among the tools listed on the Angular site are:
- Angular CLI, a command line interface for Angular
- Angular Universal, a server-side rendering for Angular applications, which is on GitHub
- Angular Augury, a “developer tool extension for debugging and profiling Angular applications inside the Google Chrome and Mozilla Firefox browsers”
- Celerio Angular Quickstart, which GitHub lists as a simple way to generate an Angular 5 create, read, update, and delete (CRUD) application from an existing database schema
- Codelyzer, with a GitHub link to code for static analysis for Angular projects
- Compodoc, a documentation tool for Angular projects, which touts its capability to “Generate your Angular project documentation in seconds”
If you are creating online storefronts with Angular, AngularCommerce provides a framework for building e-commerce applications with Google Firebase. The Angular site says AngularCommerce provides a set of design-agnostic components that allows developers to easily extend functionality.
Quick Start for TypeScript
If you are new to TypeScript, you may want to start at the TypeScript open source project’s website. It offers a Quick Start with a five-minute tutorial as well as other documentation, downloads and a “Playground” with code samples.
Whether you are a TypeScript veteran or have just learned to work with it, the aforementioned Angular site has lots of helpful stuff starting with an Angular CheatSheet that Neward recommends bookmarking. It has code samples, starting with how to bootstrap the Angular platform and then moves on to:
- Template syntax
- Built-in directives
- Class decorators
- Directive configuration
- Component configuration
- Class field decorators for directives and components
- Directive and component change detection and lifecycle hooks
- Dependency injection configuration
- Routing and navigation
There are Angular how-to books listed on Amazon that you might want to browse. However, you could spend your time rather than your money on the Angular site and probably learn a lot.
Beyond Angular with Microsoft Blazor
Beyond developing in Angular, you might want to check out Microsoft’s Blazor-based .NET technology, which stole the show at a recent VSLive! keynote by Scott Hunter, Partner Director Program Management, .NET, at Microsoft.
Find out more about the futuristic Blazor technology in this article based on the keynote by David Ramel, editor of Visual Studio Magazine.
Posted by Richard Seeley on 10/16/2018
Now you know.
Okay, glad we got that cleared up.
As with any relationship between brothers, there is sibling rivalry.
Imitation is the sincerest form of flattery.
In June, up-and-coming TypeScript broke into the Top 100 in programming language rankings, according to this Visual Studio Magazine article, which quotes the TIOBE Index popularity report: "This month TypeScript debuts at position 93 in the TIOBE index top 100 … The Microsoft language has been tracked for a couple of years now, but although its popularity in industry seems high, it never made it to the top 100. So finally it has got sufficient traction to be noticed.”
Little brothers are so competitive.
Posted by Richard Seeley on 09/13/2018
One of the cool features in ASP.NET Core is support for the Dependency Injection software design pattern.
For developers working in object-oriented programming and web services, dependency injection supports what Microsoft defines in its documentation as valuable best practices:
- Design services to use dependency injection to obtain their dependencies.
- Avoid stateful, static method calls (a practice known as static cling).
- Avoid direct instantiation of dependent classes within services. Direct instantiation couples the code to a particular implementation.
If you are not familiar with the concept of dependency injection, Wikipedia provides this basic definition:
“… dependency injection is a technique whereby one object (or static method) supplies the dependencies of another object. A dependency is an object that can be used [as] (a service). An injection is the passing of a dependency to a dependent object (a client) that would use it. The service is made part of the client's state. Passing the service to the client, rather than allowing a client to build or find the service, is the fundamental requirement of the pattern.”
In object-oriented programming, web services and Service-Oriented Architecture (SOA), the goal is to use objects as building blocks of an application in a way that allows you to make changes by adding and removing objects rather than writing extensive code. As Wikipedia explains: “The intent behind dependency injection is to decouple objects to the extent that no client code has to be changed simply because an object it depends on needs to be changed to a different one.”
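That decoupling is easiest to see in a small example. Here is a minimal sketch of constructor-based dependency injection, shown in plain Python since the pattern itself is language-agnostic; the class names are invented for illustration and come from no particular framework:

```python
class SmtpEmailService:
    """A 'real' implementation of the dependency."""
    def send(self, to: str, body: str) -> str:
        return f"SMTP -> {to}: {body}"

class FakeEmailService:
    """A test double: same interface, no network."""
    def send(self, to: str, body: str) -> str:
        return f"FAKE -> {to}: {body}"

class WelcomeNotifier:
    # The client declares what it needs on its constructor;
    # it never builds or looks up the service itself.
    def __init__(self, email_service):
        self._email = email_service

    def welcome(self, user: str) -> str:
        return self._email.send(user, "Welcome aboard!")

# Swapping the dependency requires no change to WelcomeNotifier:
print(WelcomeNotifier(SmtpEmailService()).welcome("ann@example.com"))
print(WelcomeNotifier(FakeEmailService()).welcome("ann@example.com"))
```

Because `WelcomeNotifier` receives its dependency rather than constructing it, a different implementation (here a fake for testing) can be injected without touching the client's code, which is exactly the decoupling the Wikipedia definition describes.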
In Microsoft pages outlining the fundamentals of dependency injection, it offers a simple definition: “A dependency is any object that another object requires.”
Diving into the Documentation
For developers who specifically want to start using dependency injection with ASP.NET Core, Microsoft offers step-by-step documentation with code samples and simple demonstration apps.
On Microsoft’s ASP.NET Blog, Jeffrey T. Fritz, a Microsoft .NET Program Manager, explains the basics of Dependency Injection in ASP.NET Core, noting: “With ASP.NET Core, dependency injection is a fundamental tenet of the framework. All classes instantiated by the framework are done so through the container service that is maintained by the framework in a container and configured by default in the Startup/ConfigureServices method.”
Fritz reassures developers that the framework makes adopting dependency injection very straightforward: “ASP.NET Core makes it easy to get started with this design pattern by shipping a container that you can use with your application. Configure your application’s controllers, views, and other classes that are instantiated by the framework with parameters on the constructor method to have those types automatically created and passed in to your class.”
The blog contains code samples to illustrate how the design pattern works and points developers to Microsoft ASP.NET Dependency Injection documentation. The authors provide loads of code samples and offer the following recommendations to developers working with dependency injection:
- Avoid storing data and configuration directly in the service container. For example, a user's shopping cart shouldn't typically be added to the service container. Configuration should use the options pattern. Similarly, avoid "data holder" objects that only exist to allow access to some other object. It's better to request the actual item via dependency injection, if possible.
- Avoid static access to services.
- Avoid using the service locator pattern (for example, IServiceProvider.GetService).
- Avoid static access to HttpContext (for example, IHttpContextAccessor.HttpContext).
That is just a sample of the detailed information available on that site.
Beyond the Basics
Once you have the basics down, Steve Smith, Microsoft MVP since 2002 and a founding member of the ASPInsiders, an external advisory group for the ASP.NET product team, provides insight on the MSDN site for Writing Clean Code in ASP.NET Core with Dependency Injection.
He notes that dependency injection is “an increasingly common technique in .NET development, because of the decoupling it affords to applications that employ it.”
“ASP.NET Core not only supports DI,” Smith writes, “it also includes a DI container—also referred to as an Inversion of Control (IoC) container or a services container. Every ASP.NET Core app configures its dependencies using this container in the Startup class’s ConfigureServices method. This container provides the basic support required, but it can be replaced with a custom implementation if desired. What’s more, EF Core also has built-in support for DI, so configuring it within an ASP.NET Core application is as simple as calling an extension method.”
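To make the register-then-resolve idea concrete, here is a toy services container sketched in Python. It very loosely mimics what registering services in `ConfigureServices` and constructor resolution accomplish in ASP.NET Core; all names are invented for illustration, and this is not the real ASP.NET Core API:

```python
import inspect

class Container:
    """A toy DI container: register implementations, resolve object graphs."""
    def __init__(self):
        self._registrations = {}

    def register(self, abstraction, implementation):
        self._registrations[abstraction] = implementation

    def resolve(self, cls):
        # Look up a registered implementation, then build its constructor
        # dependencies recursively from their type annotations.
        impl = self._registrations.get(cls, cls)
        sig = inspect.signature(impl.__init__)
        kwargs = {
            name: self.resolve(param.annotation)
            for name, param in sig.parameters.items()
            if name != "self" and param.annotation is not inspect.Parameter.empty
        }
        return impl(**kwargs)

class Greeter:
    def greet(self) -> str:
        return "hello"

class HomeController:
    def __init__(self, greeter: Greeter):  # dependency declared on the constructor
        self.greeter = greeter

container = Container()
container.register(Greeter, Greeter)          # analogous to ConfigureServices
controller = container.resolve(HomeController)  # framework-style instantiation
print(controller.greeter.greet())  # → hello
```

The controller never news up its own `Greeter`; the container inspects the constructor and supplies one, which is the same inversion of control the ASP.NET Core container performs for controllers, views and other framework-instantiated classes.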
More about ASP.NET Core
If you are looking for an enthusiastic endorsement and overview of ASP.NET Core, don’t miss Philip Japikse's Q&A on Hands-On with ASP.NET Core and EF Core in Visual Studio Magazine.
“I think ASP.NET Core is the biggest game changer in the history of Web development using the Microsoft stack,” asserts Japikse, Developer, Coach, Author, Teacher, Microsoft MVP and Visual Studio Live! presenter.
“Microsoft developers are no longer restricted to running their applications on Windows Server, but can essentially run anywhere,” he explains. “This brings in a myriad of options for deployment targets, including popular containers (like Docker) and lower cost (than Windows) Linux distros. This also opens up .NET as a viable option in those organizations that require development tools to run cross platform. In the past, .NET developers were shut out from those opportunities since Java was the only large-scale enterprise toolset that could ‘check the box’ regarding running cross platform.”
Posted by Richard Seeley on 08/09/2018