Lambda, Go, and GotSport via API Gateway

While I've been primarily living in the mobile world recently, I've been intrigued by the developments in a few other areas of technology.

Go, or golang, has increasingly become my preferred language for side projects.  I really like the concise nature of the language, the ability to deploy on multiple platforms, and the short toolchain.  It 'just works', and is fun to program in.

AWS continues to add interesting services and features.  I recently moved my website from GoDaddy to host on S3.  This really got me thinking about living in a 'serverless world'.  While practically speaking, hosting on S3 isn't all that different from virtual hosting at GoDaddy, it's really just scratching the surface.  You can now build pretty interesting applications without running a server (virtual or otherwise).

I began to think about how you could build a full application using the AWS stack without an EC2 instance.  Of course, a ton of thought has already been put into this.  The Serverless framework allows you to easily configure the API Gateway and Lambda to deploy functional APIs.  There is also a website dedicated to living serverless.

I've also been working on a few Amazon Alexa applications for the Echo, which also use Lambda as the preferred deployment platform.

So I thought it was time to build something that actually works.

My son plays soccer and the Colorado Soccer Association uses the event.gotsport.com website to post schedules and results.  So I built a screen scraper in Go to parse his schedule into JSON so I could use it in my Amazon Alexa application.  (The scraper is here: https://github.com/ericdaugherty/gotsport-scraper)  But I hard-coded the resulting JSON into that application.

I figured I could build a general API that could be used by anyone to convert the schedules into JSON.  And I could easily deploy it on Lambda.

In order to get Go running on Lambda, you need to 'work around' the fact that it isn't officially supported.  The lambda_proc library (https://github.com/jasonmoo/lambda_proc) does just that.  It uses a node.js wrapper that invokes your Go application within the Lambda runtime.  The repository has a good example that shows you how to write and deploy a Go app on Lambda.

From there, I just needed a simple Go app to take the input JSON, run the gotsport-scraper I wrote, and return the resulting JSON.
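
To make this concrete, here is a rough sketch of what such a handler looks like.  The lambda_proc signatures are taken from that project's README, and the scraping itself is reduced to a stub (fetchSchedule), so treat the details as approximate rather than the exact code:

package main

import (
    "encoding/json"

    "github.com/jasonmoo/lambda_proc"
)

// Request mirrors the JSON produced by the API Gateway mapping template shown below.
type Request struct {
    EventID string `json:"eventId"`
    GroupID string `json:"groupId"`
    Gender  string `json:"gender"`
    Age     string `json:"age"`
}

// fetchSchedule stands in for the gotsport-scraper call, which fetches the
// event.gotsport.com schedule page and parses it into a schedule value.
func fetchSchedule(req Request) (interface{}, error) {
    // ... fetch and scrape the schedule here ...
    return nil, nil
}

func main() {
    // lambda_proc reads the event JSON from the node.js shim and writes the
    // handler's return value (or error) back out as JSON.
    lambda_proc.Run(func(context *lambda_proc.Context, event json.RawMessage) (interface{}, error) {
        var req Request
        if err := json.Unmarshal(event, &req); err != nil {
            return nil, err
        }
        return fetchSchedule(req)
    })
}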

The final step was exposing the Lambda function as an HTTP API.  This is where the API Gateway comes in.  It allows you to specify an endpoint and method that trigger the Lambda function.  The basics are pretty straightforward: you define a resource (/gotsport) and a method (GET), and map it to your Lambda function.  However, the tricky part is mapping the HTTP request to the Lambda function's input, and the Lambda function's result to the HTTP response.

Here is the full lifecycle:



I decided to use query string parameters to pass data to the function, so you could just cut/paste from the URL you wanted converted.  You can define the query strings in the Method Request section, but this just seems to allow you to use them when testing and isn't required.  I did have to map the query parameters into the Lambda function's input, so I created an application/json mapping template (since the Lambda function consumes JSON).  The mapping template is:
{
    "eventId" : "$input.params('EventID')",
    "groupId" : "$input.params('GroupID')",
    "gender"  : "$input.params('Gender')",
    "age"     : "$input.params('Age')"
}

This maps the query string parameters, using their names as they appear on gotsport.com (so you can cut and paste), into JSON values that match those used by my gotsport-scraper tool.  The result is then passed to the Lambda function.
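
For example, the query string from the sample URL at the end of this post (EventID=46461&GroupID=511424&Gender=Boys&Age=11) gets mapped to:
{
    "eventId" : "46461",
    "groupId" : "511424",
    "gender"  : "Boys",
    "age"     : "11"
}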

The Lambda function runs, fetching the requested URL, scraping it, and returning a JSON value.  However, the lambda_proc wrapper returns both an error value and a data value containing the results of the Lambda function.  I wanted the output to contain just the JSON representing the schedule.  So in the Integration Response step of the lifecycle, I used this application/json mapping template:
#set($inputRoot = $input.path('$'))
$input.json('$.data')
This just extracts the data element from the JSON returned from the Lambda function and passes it back as the HTTP Response Body.
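In other words, if the Lambda function returns an envelope along these lines (the error and data fields come from lambda_proc; the schedule content here is just a placeholder):
{
    "error" : null,
    "data"  : { ...schedule JSON from the scraper... }
}
then only the value of data is passed through as the HTTP response body.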

The proper approach in the Integration Response step is to use a regex to determine whether an error occurred, and to return the proper HTTP response code and an appropriate body.  For now I'm assuming a 200 response with valid data.

That's it!  I now have an HTML->JSON screen scraper for the GotSport website deployed as an API.

Want to see how this all works?  Here are the resources:

Want to test it out?  Grab a specific schedule from the CSA Youth Advanced League 2016 site.  Then cut/paste the query string parameters onto my API URL: https://j4p9lh1dlb.execute-api.us-east-1.amazonaws.com/prod/gotsport  For example, the U-11 Boys Super League would be: https://j4p9lh1dlb.execute-api.us-east-1.amazonaws.com/prod/gotsport?EventID=46461&GroupID=511424&Gender=Boys&Age=11



Unit Testing with throws in Swift 2.0

I've been working very slowly on a Swift version of Playlist Export, and today I finally got around to updating the project to Swift 2.0.  Luckily I wrote test cases for much of the logic to manage the export process, so I had some level of confidence that I would know if I broke anything.

However, after I got the project to compile, only one of the test cases ran.  There were no errors or any other indication of why the other cases were ignored.

I did finally figure out that if your test case has a throws clause, it will be IGNORED.

So:
func testPlaylistNameExtension() throws {
    ...
}

will silently be ignored, which for a test case is a pretty bad scenario.  But if you handle the error in the function and change it to:
func testPlaylistNameExtension() {
    ...
}

then everything is just fine.  So if you have tests in Swift 2.0 that are being ignored or not executed, check to see if you have a throws in the function declaration.


Home Theater

I recently got to fulfill a longtime dream of building a 'home theater'.  Luckily, the basement was already finished with a room large enough to house one, so as home theater projects go, it was pretty straightforward.  This is a write-up of the project, which I completed in May of this year, so I've had almost six months with it now.

The room was previously set up as a TV room, with a 60" rear-projection DLP TV (Mitsubishi WD-60738), a very large entertainment center I had purchased for a much larger room in a previous house, and a sectional couch.

The 60" TV and too-large entertainment center:


This setup also consisted of:

Onkyo TX-SR605
3 Phase Tech PC-3 Speakers (Fronts)
2 Phase Tech PC-60 Speakers (Rears)
Phase Tech TODO Subwoofer
Sony Blu-Ray Player
DirecTV HR20-700 DVR

I also had an IR repeater setup, with most of the components already located in the utility room next door.  The system was based on the Niles IR Repeater with the receiver and emitters cobbled together over many years.

The room has three walls, with the back open to a larger room, and no windows on any of the three walls.  It is 11.5' wide x 17' deep, which isn't quite as wide as I would have liked, but gives me plenty of length, especially since I could spill over into the larger room behind it.

The existing room and equipment provided a solid foundation for the new home theater.  The Onkyo receiver and HR20 were both used in the new setup.  I struggled over whether to keep the Phase Tech speakers or replace them, but ended up replacing them.  They had been 'well used' over a ~15 year lifespan: the speaker cones were in need of repair, some of the dome tweeters were smashed, and there was loose material inside that you could hear when you lifted them.  The rears were also pretty large to be wall mounted, which made them awkward to walk past.

While researching the room, I leaned heavily on two resources:
1. The Wirecutter
2. AVS Forum

I really appreciated the Wirecutter's approach of picking the best item in a given category.  It really simplifies the decision-making process, but they also give you a few options in addition to their pick so you can understand the trade-offs.  This site is how I picked the screen and projector, and it heavily influenced the selection of the speakers.

For the projector, I pretty much took the Wirecutter's recommendation, and verified the selection by reading reviews on AVS Forum.  I selected the Sony VPL-HW40ES. Overall, I have been very happy with the projector.  The picture quality is great, it is relatively quiet, and has worked flawlessly so far.  I did briefly consider a 4k projector, as the HW40ES is only 1080p, but my logic was that the cost difference for early stage 4k projectors wasn't worth it, and by the time 4k content becomes readily available there will be better projectors out for much less money.

I mounted the projector to the ceiling with a set of Chief components: the CMA-101, CMS-003, and RPA-020.  The mount is very solid and pretty easy to get set up and aligned.

There are a lot of options for screens, but since I was in a basement and had full control over the light, I didn't need a high-gain screen.  I could also use a fixed screen since it would be permanently mounted to the wall.  Again, I went with the Wirecutter's recommendation of the Silver Ticket 100" (diagonal) screen.  It takes a bit of effort to put together, but it was straightforward and I was very pleased with the finished product.

Once I decided to replace the speakers, I really struggled with how to select the best options.  Many quality speakers are now sold directly to consumers, allowing you to skip the AV store listening rooms.  I spent a lot of time listening to speakers 15 years ago when I purchased the Phase Tech speakers originally, and frankly I didn't want to go through that again.  I turned to the Wirecutter and AVS Forum again for advice.  In the end, I started with the Wirecutter recommendation for surround systems, although I found the runner-up at the time, the NHT Absolute 5.1 system, more interesting.  However, I had an existing sub that was more than adequate for the room, so I ended up with the NHT Classic Three bookshelf speakers as the fronts, the Classic ThreeC as the center, and their thin Absolute Wall speakers as the rears.  I've been very happy with them so far as well.

While I had an existing Sony Blu-ray player, I decided to upgrade it because I wanted to move it to another room to replace an existing DVD player.  For this, I ended up going with a popular selection from the AVS Forum, the OPPO BDP-103D.  I really like this player, but I'm not totally convinced by the Darbee processing.  I have the Darbee processing applied to the OPPO-sourced video, and I also run the DirecTV video through it.  I can certainly see some differences in the side-by-side option it has, especially when you crank the processing up high.  I have mine set at 35, I believe, and mostly take it on faith that it is better.  That said, as the owner of a phenomenal early DVD player, the DVP-7000, I've been mostly disappointed in every DVD or Blu-ray player I've purchased since then.  The OPPO is the closest thing to the DVP-7000 that I've owned since, and that is high praise.

To round out the electronics portion of the upgrade, I bought a Harmony 650 remote.  I've used Harmony remotes for as long as I can remember and I've been happy with all of them.  I was somewhat leery of moving from a Harmony One to the 650 based on some feedback I had read, but I have been very pleased with the 650 and it does everything I need.

For the furniture, the room wasn't quite wide enough to fit 4 seats and still leave room for an aisle, so I ended up with two rows of 3 HT Design Devonshire chairs.  Since I could only fit 6 chairs, I added a bar behind the chairs with 4 stools, increasing our capacity to 10 people and providing a space to eat and socialize when we have people over to watch games, etc.  I'm generally happy with the chairs.  They are very comfortable and feel well built.  I think the backlit buttons are a bit too bright, and I could generally do without the lights on the chairs, as guests have a tendency to turn them on without realizing it.  I need to just disconnect them, but I haven't gotten that far.

The riser is a bit time-consuming to build, and the instructions are horrible.  There were some YouTube videos, but they seemed to use slightly different hardware and were of minimal use.  We did re-carpet the basement towards the end of the project, so we were able to get the riser carpeted to match the rest of the room easily.  The riser is really just tall enough, although anything taller and you'd be banging your head on the ceiling when you stood up, so it works well.

I purchased all this from the following sources, and had good experiences with all:
Craig at AV Science, Inc. set me up with the projector and the Chief mounts.
Alan at HT Market set me up with the chairs, riser, and bar.
I bought the speakers directly from NHT.
The rest was purchased on Amazon or elsewhere.

Here is the finished product:






Site Migration

After hosting this site on GoDaddy for many years, I've decided to migrate it to Amazon via S3.  Amazon has some great features for hosting static websites, although I've gone 'bare bones' to start: I'm just hosting out of an S3 bucket, using the wwwizer Naked Redirect service to redirect ericdaugherty.com to www.ericdaugherty.com.

Amazon has a DNS server (Route 53) and CDN (CloudFront) that are easy and inexpensive to use, but I don't think I need them yet.  For now, I'm still using GoDaddy's DNS Server and the wwwizer 'hack' instead of Route 53.

The previous site used somewhat of a 'poor man's template engine': I had an HTML file for each page on the site, but had Apache evaluate them as PHP files and used PHP includes to build up each page from common components.

Moving to a fully static website meant I needed a real template engine.  I selected Jekyll and migrated the site over.  It was a pretty straightforward migration, and it ended up reducing the size of each file since I could use a true template instead of just including common components.

I then use the AWS console tool to upload the generated website files to S3 for an easy deployment, also allowing me to finally retire FileZilla from my tool chain.

Amazon has some pretty good guides to doing this, but I also used two good blog posts: Amazon S3 on Domain Root, without Route 53 and Static website on S3, CloudFront and Route 53, the right way!

The blog portion of the site is still hosted at Blogger, which has worked well and continues to do so.

This also forced me to make a few updates to the site, fixing some broken links and removing some no-longer-relevant sections.

Plus it gave me an excuse to finally post on the blog.

USA Pro Cycling Challenge 2014

The USA Pro Challenge rolled through Golden this weekend, and I headed to town to try to get some shots.

See all the USA Pro Challenge 2014 pictures on Smugmug.

The race brought them through Golden and then up Lookout Mountain.  They managed this climb in less than half the time of my best effort.  They came back and did a loop through downtown Golden before heading down to Denver for the finish.

I positioned myself at the corner of Washington and 10th to catch them coming through downtown Golden, with the 'Welcome to Golden' sign in the background.


Here is the overall winner rolling through Golden:


This lone rider nearly got clipped by one of the support cars.





Fireworks in Estes Park

I spent the 4th of July in Estes Park this year, and enjoyed a great fireworks show in front of the Rocky Mountains.










See the rest of the pictures here.

Alluvial Fan in Rocky Mountain National Park

I got to visit Estes Park and Rocky Mountain National Park this weekend, which gave me a chance to see some of the flood damage from this fall.  Nothing was quite as impressive as the amount of material deposited at the base of the Alluvial Fan in RMNP.  The river used to run down the mountain and under the bridge.  The bridge survived, but as you can see, it no longer crosses the river.  The river used to flow about 15 feet under this bridge.  Now there is rock and sand above the level of the bridge...


Instead, the river was redirected a bit farther west and took out the road.  The amount of material that was brought down was impressive.  These signs both used to be normal height...



And this sign was taken out by the flow of water, sand, and rocks.


Just about everything in this picture was not here previously.  All of this rock and sand was brought down during the flood.