Tuesday, October 27, 2015

Exercises for New Developers

Over the course of my career, I have needed to guide newer developers learning either general programming practices or a specific topic. I usually have the developer work through a sample project with some specific instructions intended to push that person along a path. I was recently asked to find my notes from one of these exercises to share with another individual. Instead of just emailing the instructions, I wanted to capture them more permanently.

These write-ups are not very detailed. They are written with the assumption that a more experienced programmer is "administering" the exercise to someone, so the instructions are more like teacher's notes. The only comparison I can come up with is the Spiritual Exercises of St. Ignatius Loyola. He wrote a manual of sorts for spiritual directors to guide retreatants through spiritual "exercises" to improve their relationship with God.

As I write up these exercises, I will link them here.


Calculator Project

Calculator Project

This is the first write-up of a software development exercise that I have used to help train developers. This post should be viewed as instructions for the instructor, more than a step-by-step guide for someone trying to learn.

In this project, the user will create a basic calculator application. The specific project steps are written to encourage the user to create a program that interacts with the user from the command line. This is done to keep the focus of the project on Test Driven Development and good programming practices. After the main exercise is described, some alternate variations are mentioned.

Phase 1

Create a command-line application that will prompt the user for two numbers, add the numbers together, and return the result.

The program flow should look like:

C:\> calculator.exe
Please enter the first number: 3
Please enter the second number: 4
The sum of 3+4 is 7.
C:\>

The main() method should call the Calculate() method on another object. The program should be able to accept and correctly add any integer less than half of the max integer available on the system.
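
As a rough illustration of that separation, the skeleton might look something like the following in C#. The class, method, and variable names here are suggestions for the instructor, not requirements of the exercise:

    using System;

    public class Calculator
    {
        // Kept free of console I/O so it can be driven directly by unit tests.
        public int Calculate(int first, int second)
        {
            return first + second;
        }
    }

    public class Program
    {
        public static void Main()
        {
            var calculator = new Calculator();

            Console.Write("Please enter the first number: ");
            int first = int.Parse(Console.ReadLine());

            Console.Write("Please enter the second number: ");
            int second = int.Parse(Console.ReadLine());

            int sum = calculator.Calculate(first, second);
            Console.WriteLine("The sum of {0}+{1} is {2}.", first, second, sum);
        }
    }

Because the arithmetic lives in Calculator rather than in Main(), a first test (in NUnit or any other test framework) can be as simple as asserting that Calculate(3, 4) returns 7.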

Phase 2

Modify the existing calculator program to be able to perform subtraction. The user should enter two numbers just as with addition, but will be prompted to select an operation. All of the constraints from the first phase still apply.

Phase 3

Modify the existing calculator program to be able to perform multiplication and division. If the user enters a 0 as a divisor, the program should return a message indicating that division by 0 isn't possible.

Phase 4

Modify the existing calculator program to accept floating point values (decimal values) as well as integers.
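
By the end of Phase 4, the Calculate method might have grown into something like the sketch below. The string-based operation parameter, the use of decimal, and throwing for a zero divisor are just one possible design; the exercise does not prescribe them:

    using System;

    public class Calculator
    {
        // Supports +, -, *, and / on decimal values. The caller is responsible
        // for showing the "division by 0 isn't possible" message to the user.
        public decimal Calculate(decimal first, decimal second, string operation)
        {
            switch (operation)
            {
                case "+": return first + second;
                case "-": return first - second;
                case "*": return first * second;
                case "/":
                    if (second == 0)
                    {
                        throw new DivideByZeroException();
                    }
                    return first / second;
                default:
                    throw new ArgumentException("Unknown operation: " + operation);
            }
        }
    }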

Alternate Phase A

Using the calculation code developed in the first 4 phases, implement the entire application as a web site using MVC. The scope of this approach could be expanded to have the user interface mimic a calculator's interface with number and operation buttons.

Alternate Phase B

Using the calculation code developed in the first 4 phases, implement the entire application as a REST API, with a single page application (SPA) framework such as AngularJS as the user interface.
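
For the instructor's reference, a rough sketch of how the Calculator class from the earlier phases might be exposed over HTTP is shown below. It assumes MVC 6 attribute routing; the route, action name, and parameter names are suggestions only, and the exact namespaces shifted between ASP.NET 5 betas:

    using Microsoft.AspNet.Mvc; // this namespace was later renamed Microsoft.AspNetCore.Mvc

    [Route("api/[controller]")]
    public class CalculatorController : Controller
    {
        private readonly Calculator _calculator = new Calculator();

        // Handles GET requests such as /api/calculator?first=3&second=4&operation=%2B
        [HttpGet]
        public decimal Calculate(decimal first, decimal second, string operation)
        {
            return _calculator.Calculate(first, second, operation);
        }
    }

A SPA client such as AngularJS would then call this endpoint and display the returned value.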

Saturday, October 24, 2015

Creating a Sample ASP.NET 5 App to Run in DNX

In my on-going quest to better understand the .NET Execution Environment (DNX) and all of its goodness, I've been creating sample applications. Recently I've been experimenting with the Yeoman generators for ASP.NET. In this post, I want to introduce Yeoman and run through the steps necessary to quickly create a simple MVC application that can be run with DNX.

Yeoman is billed as "The web's scaffolding tool for modern webapps." It is built on top of NodeJS and uses generators to create whatever project or file is requested. A developer calls Yeoman with a specific generator to kickstart a new project; the generator encapsulates all of the necessary details for a given technology stack.

Microsoft has had a similar scaffolding concept built into the File > New Project menu item in Visual Studio. Most .NET developers expect this type of templating to be available to them. Since DNX is enabling cross-platform development in the .NET stack, and Visual Studio is not cross platform, Microsoft is leaning on Yeoman to provide the scaffolding for ASP.NET 5.

Installing Yeoman is very easy once NodeJS, specifically the Node Package Manager (npm), is installed. (NodeJS can be downloaded from the NodeJS web site.) Open a command prompt and execute the following command:

npm install -g yo
This command installs Yeoman globally for everyone. To call Yeoman, execute the command
yo
from a command prompt. It will display a text menu of options, including generating applications with any installed generators.

Once Yeoman is installed, the generators for ASP.NET 5 need to be installed. They are installed by executing the command

npm install -g generator-aspnet
This command installs the ASP.NET 5 generators globally.

The following steps can be taken to use Yeoman to create a basic MVC 6 application that will run in the DNX. These instructions assume you already have the DNX installed.

  1. Call Yeoman:
    yo aspnet
    1. Select Web Application Basic from the menu presented and hit Enter.
    2. Name your application HelloWorldMvc at the next prompt.
  2. Change directory to your new MVC application folder:
    cd HelloWorldMvc
  3. Restore NuGet packages:
    dnu restore
  4. Run the application:
    dnx kestrel
  5. Navigate to the application in your browser by going to http://localhost:5000
  6. You should see the default page with the name HelloWorldMvc in the upper left-hand corner.
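
Step 4 works because the generated project.json includes a commands section that maps the name kestrel to the package that hosts the site; dnx simply looks up and runs that entry. The exact command text varies between beta releases, so the snippet below is only an illustration of the shape, not the literal output of the generator:

    {
      "commands": {
        "kestrel": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.Kestrel"
      }
    }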

Congratulations! You have created and executed an ASP.NET 5 application in the .NET Execution Environment.

Thursday, September 24, 2015

Understanding DNX

Getting a handle on the new .NET Execution Environment (DNX) is difficult. I've been following the project for a while, and still get twisted around trying to explain it to people. I believe the following analogy and diagram help to understand the environment.

Keep in mind that this is a tenuous comparison meant to help people understand the environment; it is not a head-to-head evaluation of the technologies involved.

I often compare DNX and its ecosystem to the MEAN development stack (MongoDB, ExpressJS, AngularJS, and NodeJS). This development stack consists of a runtime environment (NodeJS) that hosts a framework for creating server-side applications (ExpressJS). These server-side applications can be accessed using client-side frameworks such as AngularJS. The entire stack can be backed with MongoDB for permanent storage.

The DNX environment is similar, but much richer. DNX is the runtime environment, much like NodeJS is for the MEAN stack. MVC can be used to create an API (formerly Web API) that runs on the server in DNX, much like ExpressJS is used in the MEAN stack. The client-side component can still be AngularJS calling that API. This is where the comparison ends.

DNX is unique in that it requires an implementation of .NET to be available in order for it to serve up applications. As of the Beta7 release, it can use either Mono or CoreCLR. This requirement allows software developers to completely control the environment in which their application lives, by choosing either Mono or CoreCLR as the backing .NET implementation.

The best part is that all of this can be run on Windows, Mac, and Linux! DNX enables developers to completely control their application's environment and runtime.

In my analogy, I left out the backing database technology in the DNX ecosystem. As the CoreCLR advances, it will enable applications to access SQL Server natively. This capability means that an app running on a non-Windows operating system will have "native" access to SQL Server.

Below is a basic diagram showing how the DNX landscape looks. Not all of the functionality depicted is currently available, but an awful lot of it is.

Since the DNX environment is changing rapidly, I would appreciate any feedback as to anything I have stated that is incorrect.

Tuesday, September 15, 2015

First Node.js App in Azure

I read Ted Neward's article about deploying a simple Node.js app to Microsoft Azure and wanted to give it a try.

The steps mostly worked, except for a few issues. Below are the steps that he lays out. I am assuming that you have clicked the link and read the article; he does a much better job explaining everything. In this post, I just want to note the changes I had to make to his steps.

  1. In a local directory, create the following JavaScript file. This file creates a basic web server using Node.js.
           var http = require('http');
           var port = process.env.PORT || 3000;
           http.createServer(function(req, res) {
             res.writeHead(200, { 'Content-Type': 'text/plain' });
             res.end('Hello World');
           }).listen(port);
         
    In the article, this script is named "helloHTTP.js." For the file to work in Azure, it needs to be named "server.js".
  2. Test the script using Node.js
    1. Run the script using Node.js:
      node server.js
    2. Open the URL http://localhost:3000 in your browser.
    3. You should see the string "Hello World".
  3. Install the azure-cli tools:
    npm install -g azure-cli
  4. Access your Azure account:
    azure account download
  5. Import the file that was just downloaded:
    azure account import <path to the downloaded file>
  6. Create the site with git:
    azure site create --git
    In the article, the --git option was shown with only a single dash. Also, this step will prompt you for a site name and region. The name you choose must be unique across Azure. Pay attention to the output messages, too; they give you the full URL for accessing the site once it is created.
  7. Add the script file:
    git add server.js
  8. Commit the file:
    git commit -m "Initial site creation."
  9. Push the site to Azure:
    git push azure master
At this point, you should be able to navigate to the site name created above, and see "Hello World" generated by a Node.js script in Azure.

Sunday, August 2, 2015

Highlights of My Faith Journey

I decided to write this for the benefit of some people that I know that may have lost their faith in God. This is not an extensive treatise on my faith journey, but a quick summary of the major elements that have me where I am.

Shortly after I was born, my mom noticed that my head didn't look right. After consulting with the pediatrician, they decided my condition required further investigation. It looked like the right side of my head was caved in. It was so noticeable that baby pictures of me all have me in a hat or some kind of head covering, and I'm turned to show the left side. My head was not right.

Consulting with a neurologist led to the diagnosis of craniosynostosis. It is a condition where the plates in the skull fuse together prematurely, causing the rest of the skull to develop incorrectly. At the insistence of my pediatrician, it was decided that I would undergo surgery.

The surgery was performed and was successful. Other than a really nasty scar on the side of my head, you really can't tell there was ever a problem.

Growing up, I always did well in school. Through grade school and high school, I was in the top 3 of my class. Despite successes in school, I was not a great athlete. I was even worse at making and keeping friends. Quite often, I was picked on in school.

I always struggled with the conversations about how everyone has a talent or gift. I wasn't good at sports, I don't have any musical talent, and any talents I did seem to have always drew ridicule. It was not uncommon for me to get down on myself and wonder if I was really meant to be here. (Just to be clear, I never contemplated suicide, just challenged what my purpose on earth was.)

The feelings of not being meant to be here lingered on and off into my adult life. That is, until my third son came along.

My third son came along, and was a fantastic infant. My mom would always ask about a spot on his forehead that didn't look right, and my wife and I would say it looked fine. After a LOT of asking, we decided to ask our pediatrician about it. She didn't see anything, but recommended an MRI to be safe. The MRI revealed that my son also had craniosynostosis! It was a different spot on his head, but he had the same thing.

All of a sudden, I was in a whirlwind of doctor's visits that led to the decision for my 9-month-old son to have the same kind of surgery. One afternoon, my wife and I were sitting in the office of a plastic surgeon with our son, talking to the plastic surgeon and a neurosurgeon about the surgery my son was about to undergo. Growing up, I had always heard my mom describe my surgery as a doctor cutting open my head, inserting a biodegradable plastic thingy, and stitching me up. Here is how my son's surgery was described to me: the plastic surgeon would make an incision from ear to ear on my son's scalp, remove the fused plates on the top of his skull, break them apart, and put them back together with effectively a biodegradable erector set. What?!? The neurosurgeon would be present to make sure the membrane separating the skull from the brain wasn't damaged and no other harm came to the brain or nervous system. What?!? The entire surgery would take about 3 hours.

From there, there were numerous hours of planning and preparation to have my son in the hospital with my wife staying with him. Family members and I donated blood to have on hand just in case. It was overwhelming.

On the day of the surgery, we had to take him to a pre-op room to get him ready. The anesthesiologist gave him a medicine mixture to start slowing him down. Then a nurse took him from my wife's arms and walked him to surgery. We were left to wait in a waiting room for 3 hours. It was the worst 3 hours of my life!

The surgery was successful! Needless to say, during the surgery and since, I have reflected on this quite a bit. My son's surgeons were quite skilled. The neurosurgeon was well-known for his work, and the plastic surgeon was well-known for helping kids with cleft palates. These miracle-workers had done this surgery a lot, and were prepared for everything. When this surgery was done on me, it was not well-known or well-practiced. The doctor that operated on me was knowledgeable about neurosurgery, but I don't think he was well-practiced in fixing craniosynostosis. The only conclusion I could draw from these experiences is that my life, like all lives, is a gift from God. He punctuated it for me by carrying me through a very risky surgery when I was just 7 months old. He left a scar on me to remind me of my gift.

I am an imperfect human being, prone to sin. I don't always live my life as I should. I'm not always a beacon to others of the love Christ has for me. But I am much more aware of my life as a gift, and I am working daily to be a better disciple.

I hope that my journey to realizing the great gift I've been given will help others recognize their lives as tremendous gifts, given to them by a loving God.

Tuesday, July 7, 2015

Building/Running the omnisharp-roslyn Project

I've been trying to get involved in the Omnisharp project. Omnisharp is actually a set of projects that provide many of the experiences present in Visual Studio to other editors like Emacs, Vim, or Sublime. This post captures some of the steps I've gone through to get a part of the Omnisharp project building. Most of this post assumes the reader has an understanding of the DNX environment and its utilities.

Omnisharp has two primary components: a server and a client. The server provides HTTP endpoints that return information about a code base. Each client is a plugin for the user's editor of choice that calls the server and uses the data to provide syntax highlighting, code completion, refactorings, etc. The Omnisharp server started life in the omnisharp-server GitHub project. The latest version is now in the omnisharp-roslyn GitHub project. It's this project that I've been trying to build.

omnisharp-roslyn provides a build script (either build.cmd or build.sh) to make this a no-brainer. This script tries to set up the local environment, update the required packages, and publish the server. All of this relies on the .NET Execution Environment (DNX). There is nothing wrong with the build script from what I can see, so try it first. I ran into errors with it and had to do a few things step-by-step.

Below are the steps I ran to get the omnisharp-roslyn bits running from the code:

  1. Upgrade DNX to the latest unstable build:
    dnvm upgrade -unstable
    This installed dnx-mono.1.0.0-beta6-12174.
  2. Restore packages with that version:
    dnvm exec 1.0.0-beta6-12174 dnu restore
    This pulled down all the necessary packages.
  3. Switch to the version of DNX that omnisharp-roslyn needs:
    dnvm use 1.0.0-beta4
  4. Restore packages again:
    dnu restore
    I'm not sure why this second restore is necessary, but it was.
  5. Run Omnisharp:
    ./scripts/Omnisharp

You can have Omnisharp index a particular folder or solution file (SLN) by calling:
./scripts/Omnisharp -s /path/to/sln/or/folder

Wednesday, June 17, 2015

Splunk Searching and Grouping

I'm starting to play with Splunk searching. We process multiple large files every night as the bulk of our work. Our system writes numerous log messages during this processing activity. Every file processed gets a unique identifier. I needed to sort through these logs and group all entries for a given file together. Here is the search I used:
application="MyApp" logger="LoggerName" message="*some part of message*" | rex field=message ".part . (?\d+).*" | transaction id
The first section of the search (application="MyApp" logger="LoggerName" message="*some part of message*") represents the search terms in Splunk. The second portion is a regular expression (rex field=message ".part . (?\d+).*") that is parsing a value out of the message field and giving it the name "id". The last part (transaction id), groups all of the records that are found with the search terms by the "id" value.
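
As a hypothetical illustration (the log text below is invented, not taken from our actual system), suppose the message field of several events contains text like:
Finished processing part : 42 of the nightly load
The rex command pulls 42 out of each matching message into a new field named id, and transaction id then collapses every event with the same id into a single grouped result, so all of the log entries for one file show up together.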

Sunday, June 7, 2015

Raspberry Pi as Google Cloud Print Server

I made my Raspberry Pi 2 into a Google Cloud Print server today. I wanted to capture the links I used to set this up. There were two blog posts that I followed. The first thing I had to do was to add a printer to the Pi. This link walks you through the setup. Once the Pi knew about my printer, I could configure the Google Cloud Print service. This post walks you through that setup.

Friday, May 22, 2015

How to Think About Application Logging

A colleague of mine talked about how Splunk has changed our way of thinking about application logging. The analogy he used was that we used to think about logging as storytelling (i.e. Job1 is starting, Doing task 1, Doing task 2, ..., Job1 finished). It is much like "In the beginning... something happened... then something else happened... The End." Splunk has gotten us into the habit of thinking about logging as a series of statements that can be grouped together with the Splunk search tools (i.e. Job=1, Message=Doing something; Job=1, Message=Doing something else; ...). This is a powerful shift in thinking that allows us to better troubleshoot and diagnose system issues.
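
A minimal sketch of the difference in C#, using Console.WriteLine in place of a real logging framework (the job and task names below are made up for illustration):

    using System;

    class LoggingStyles
    {
        static void Main()
        {
            // Storytelling style: reads like a narrative, but following one job
            // through a busy log means reading the whole story top to bottom.
            Console.WriteLine("Job1 is starting");
            Console.WriteLine("Doing task 1");
            Console.WriteLine("Job1 finished");

            // Searchable style: every message carries key=value pairs, so a
            // Splunk search like Job=1 can find and group all of that job's events.
            Console.WriteLine("Job=1 Task=LoadFile Status=Started");
            Console.WriteLine("Job=1 Task=LoadFile Status=Finished");
        }
    }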

Thursday, May 21, 2015

Consolidation

I have posted to a blog minimally over the past several years. Over those years, I've posted technology and career related posts to one blog and faith related posts to another blog. The split-brained approach has never felt natural to me. To make it easier for me to post things, I'm creating a single place to blog. Over time, I may bring over some of the posts, especially the software development ones. Mostly this new blog is an attempt to make it easier for me to post.