Tuesday, July 7, 2015

Building/Running the omnisharp-roslyn Project

I've been trying to get involved in the Omnisharp project. Omnisharp is actually a set of projects that provide many of the experiences found in Visual Studio to other editors like Emacs, Vim, or Sublime. This post captures some of the steps I've gone through to get part of the Omnisharp project building. Most of this post assumes the reader has an understanding of the DNX environment and its utilities.

Omnisharp has two primary components: a server and a client. The server provides HTTP endpoints that return information about a code base. The clients are plugins for the user's editor of choice that call the server and use the data to provide syntax highlighting, code completion, refactorings, etc. The Omnisharp server started life in the omnisharp-server GitHub project. The latest version is now in the omnisharp-roslyn GitHub project. It's this project that I've been trying to build.

omnisharp-roslyn provides a build script (either build.cmd or build.sh) to make this a no-brainer. The script tries to set up the local environment, update the required packages, and publish the server. All of this relies on the .NET Execution Environment (DNX). There is nothing wrong with the build script as far as I can tell, so try it first. I ran into errors with it, though, and had to do a few things step by step.

Below are the steps I ran to get the omnisharp-roslyn bits running from the code:
dnvm upgrade -unstable
    This installed dnx-mono.1.0.0-beta6-12174.
dnvm exec 1.0.0-beta6-12174 dnu restore
    This pulled down all the necessary packages.
dnvm use 1.0.0-beta4
    This is the version of DNX that omnisharp-roslyn needs.
dnu restore
    I'm not sure why a second restore is needed, but it was.
./scripts/Omnisharp
    This runs omnisharp!

You can have Omnisharp index a particular folder or solution file (SLN) by calling:
./scripts/Omnisharp -s /path/to/sln/or/folder

Wednesday, June 17, 2015

Splunk Searching and Grouping

I'm starting to play with Splunk searching. We process multiple large files every night as the bulk of our work. Our system writes numerous log messages during this processing activity. Every file processed gets a unique identifier. I needed to sort through these logs and group all entries for a given file together. Here is the search I used:
application="MyApp" logger="LoggerName" message="*some part of message*" | rex field=message ".part . (?<id>\d+).*" | transaction id
The first section of the search (application="MyApp" logger="LoggerName" message="*some part of message*") contains the search terms for Splunk. The second portion is a regular expression (rex field=message ".part . (?<id>\d+).*") that parses a value out of the message field and names it "id". The last part (transaction id) groups all of the records matched by the search terms by that "id" value.
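To see what the extract-then-group steps are doing outside of Splunk, here is a rough Python equivalent. The sample messages and their "part : NNN" wording are made up for illustration; only the regular expression mirrors the search above (Python spells the named group (?P<id>...) where Splunk's rex uses (?<id>...)).

```python
import re
from collections import defaultdict

# Hypothetical log messages; only the "part : <number>" fragment
# that the pattern keys on matters here.
messages = [
    "Processing part : 101 of nightly batch",
    "Finished part : 101 successfully",
    "Processing part : 202 of nightly batch",
]

# Same pattern as the rex stage, using Python's named-group syntax.
pattern = re.compile(r".part . (?P<id>\d+).*")

# Group messages by the extracted id, like "transaction id" does.
groups = defaultdict(list)
for message in messages:
    match = pattern.search(message)
    if match:
        groups[match.group("id")].append(message)

for file_id, entries in groups.items():
    print(file_id, len(entries))  # prints "101 2" then "202 1"
```

The point is that once every log line carries the file's unique identifier, grouping all activity for one file is a single pipeline step.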

Sunday, June 7, 2015

Raspberry Pi as Google Cloud Print Server

I made my Raspberry Pi 2 into a Google Cloud Print server today. I wanted to capture the links I used to set this up. There were two blog posts that I followed. The first thing I had to do was add a printer to the Pi; this link walks you through that setup. Once the Pi knew about my printer, I could configure the Google Cloud Print service; this post walks you through that setup.

Friday, May 22, 2015

How to Think About Application Logging

A colleague of mine talked about how Splunk has changed our way of thinking about application logging. The analogy he used was that we used to think about logging as storytelling (i.e. Job1 is starting, Doing task 1, Doing task 2, ..., Job1 finished). It is much like "In the beginning... something happened... then something else happened... The End." Splunk has gotten us into the habit of thinking about logging as a series of statements that can be grouped together with the Splunk search tools (i.e. Job=1, Message=Doing something; Job=1, Message=Doing something else; ...). This is a powerful shift in thinking that allows us to better troubleshoot and diagnose system issues.
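The two styles from the analogy above can be sketched side by side. The field names and job ids here are made up; the point is that each structured line carries enough context to be grouped on its own, while the narrative lines only make sense read in order.

```python
# Storytelling style: lines depend on reading order for meaning.
narrative_logs = [
    "Job1 is starting",
    "Doing task 1",
    "Doing task 2",
    "Job1 finished",
]

# Search-friendly style: every line repeats the grouping fields,
# so a tool like Splunk can filter and correlate lines even when
# two jobs interleave in the same log stream.
structured_logs = [
    'job=1 task=start message="Job starting"',
    'job=1 task=1 message="Doing task 1"',
    'job=2 task=start message="Job starting"',
    'job=1 task=2 message="Doing task 2"',
    'job=1 task=end message="Job finished"',
]

# Grouping by job id works despite the interleaved job=2 line.
job1_lines = [line for line in structured_logs if line.startswith("job=1 ")]
print(len(job1_lines))  # 4 lines belong to job 1
```

With the narrative style, the same grouping would require knowing which unlabeled lines belong to which job, which is exactly the problem the key=value habit avoids.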

Thursday, May 21, 2015

Consolidation

I have posted to a blog only minimally over the past several years. During that time, I've posted technology- and career-related posts to one blog and faith-related posts to another. This split-brained approach has never felt natural to me, so I'm creating a single place to blog. Over time, I may bring over some of the old posts, especially the software development ones. Mostly, though, this new blog is an attempt to make it easier for me to post.