Ubuntu / Raspbian / Raspberry Pi Connection Problems

So today I was messing around with a Raspberry Pi running Raspbian. It turns out that if you have an Ethernet connection without Internet and a Wi-Fi connection with Internet, the Pi / Raspbian is too dumb to figure out that it should use the Wi-Fi connection for things like DNS lookups and web traffic. Slightly annoying. I'm sure there's a way to fix this, but time is not on my side today.

Update: It turns out this is more of a widespread Linux problem. This morning on Ubuntu I connected one Wi-Fi adapter to an AP that was not connected to the Internet and a second adapter to one that was. Again, the traffic seemed to take the path through the first adapter.

Ubuntu: Script to Check Internet Connection and Repair

At school there is a set of laptops that I occasionally run experiments on for my research. I usually like to work remotely, as I travel a bit and live in other cities. These laptops are all connected to the university network through a wireless AP in my office, which forwards DHCP requests to somewhere in the department. This lets each one get an external IP, which is extremely useful for SSHing into the machines remotely, one at a time. However, sometimes, for unknown reasons, the IPs revert back to 10.x.x.x addresses and the machines aren't reachable. The problem can be solved by releasing the old address a couple of times:

sudo dhclient -r wlan1
sudo dhclient -r wlan1

and then asking for a new address:

sudo dhclient wlan1

However, this isn't too helpful if I'm out of the lab. So to automate it, I came up with this script, which can be turned into a cronjob:

(more…)
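
As a rough illustration of the idea (the full script is in the rest of the post), a minimal Ruby sketch of such a check-and-repair job might look like the following. The interface name, the ping target, and the 10.x.x.x test are assumptions on my part:

#!/usr/bin/ruby
# Sketch of a connectivity check-and-repair cronjob (assumptions noted below).
# Run from root's crontab so dhclient has the permissions it needs.

IFACE     = 'wlan1'    # assumed interface name
TEST_HOST = '8.8.8.8'  # assumed reachability target

def current_ip(iface)
  `ip -4 addr show #{iface}`[/inet (\d+\.\d+\.\d+\.\d+)/, 1]
end

def online?(host)
  system("ping -c 2 -W 5 #{host} > /dev/null 2>&1")
end

ip = current_ip(IFACE)
if ip.nil? || ip.start_with?('10.') || !online?(TEST_HOST)
  # Release the old lease a couple of times, then request a new address.
  2.times { system("dhclient -r #{IFACE}") }
  system("dhclient #{IFACE}")
end

Run every few minutes from cron, something like this only goes through the release/renew dance when the interface has fallen back to a 10.x.x.x address or lost connectivity.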

Updated DNS Zone Update Tool for Hostmonster

A couple of years ago I posted a slightly modified script for Hostmonster to update the DNS zone entries for subdomains. It used Mechanize and Ruby. However, since then my script broke when Hostmonster made some changes to their backend. Another person made changes that seemed to work for a while, but it has broken again. My previous post about this is here.

The culprit seems to be the mixture of JavaScript / AJAX and the fact that Hostmonster returns an empty page when you append /ajax onto the dnszone URL. No worries though, I found another way to do it.

Using Watir and Headless, it is possible to achieve the same functionality.

You need to add a few things to Ubuntu to make it work:
sudo apt-get install ruby ruby-dev xvfb
sudo gem install watir
sudo gem install headless

Here's the updated script (note: you can still use the same cron jobs, IP scripts, etc. from the previous two techniques):

#!/usr/bin/ruby
(more…)
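
For anyone just skimming the excerpt, the general shape of the Watir + Headless approach is sketched below. This is only an illustration: the URL, the form field names, and the zone-editing steps are placeholders I've made up rather than Hostmonster's actual markup, and the file holding the current IP is assumed to come from the earlier IP script.

#!/usr/bin/ruby
require 'rubygems'
require 'headless'
require 'watir'

# Sketch only: the URL and element locators below are placeholders,
# not Hostmonster's real page structure.

headless = Headless.new   # wraps Xvfb so Firefox can run without a display
headless.start

browser = Watir::Browser.new :firefox
begin
  browser.goto 'https://www.hostmonster.com/cgi/dnszone'      # placeholder URL
  browser.text_field(name: 'login').set 'my_user'             # placeholder login form
  browser.text_field(name: 'password').set 'my_pass'
  browser.button(type: 'submit').click

  # Update the subdomain's A record (placeholder locators). The page is
  # driven by JavaScript/AJAX, which is exactly why a real browser via
  # Watir works where Mechanize no longer does.
  browser.text_field(id: 'record_subdomain').set 'home'
  browser.text_field(id: 'record_address').set File.read('current_ip.txt').strip
  browser.button(value: 'Save').click
ensure
  browser.close
  headless.destroy
end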

More startup news, wrapping up the PhD

So lately, I've been focused more on startups than I have on computer science. Probably because of the momentum I got from the Guelph Opendata Hackathon, the CODE contest, the Hub pitch, entry into the Hub accelerator, Democamp Guelph, the Conestoga Centre for Entrepreneurship, and lastly the Montreal International Startup Festival / FounderFuel.

Despite this, I have been chugging away at my thesis, making small, incremental progress. The goal is to finish writing the dissertation by the end of summer (August) and then come back to Guelph in the fall to defend. It's great to be reaching the end of this long process and very exciting to have so much opportunity ahead. Check back soon, as I'll likely be posting more technical computer science posts in the coming weeks while I focus on the dissertation and wrap up quickly. I'm wrapping up all the loose-end projects I've been a part of over the last couple of years, and it's going to be a sprint to the end.

IEEE CCECE 2014 and Cognitive Agent Simulation

This past week I attended the IEEE Canadian Conference on Electrical and Computer Engineering (CCECE 2014) in Toronto, Canada. I was there because of two papers I co-authored. The work was part of a side project I was involved in on cognitive agent simulation. The idea was inspired by the observation that in the springtime, many newborn animals are struck as they cross the street. Later in the year, it seems that fewer animals meet this fate. So have the animals that survived observed the doomed creatures being struck, learned something about the environment, and managed to become more intelligent? We aimed to model this type of environment and then re-create the most basic intelligence needed to replicate this behaviour with a cognitive agent.

I was responsible for implementing the simulation tool and the naive learning algorithm, which we also presented at the conference. The simulator was written in C/C++ and designed so that the intelligence algorithms could be swapped out later, letting us experiment with more sophisticated learning algorithms.

(more…)