Archive for the ‘Open Source’ Category

iOS: Detect Personal Hotspot

July 22, 2011 2 comments

When you want to detect the type of available connections on an iPhone, the best resource you can find on the web is the sample code from Erica Sadun’s excellent iPhone Developer’s Cookbook (which I can wholeheartedly recommend). The sample code can be found on github (look into 02 and 03).

While the solution presented is great, it fails to work on an iPhone 4 that has the Personal Hotspot feature enabled. In this scenario, the iPhone creates a network interface called “ap0” that bridges through to “en0” (WiFi) and “pdp_ip0” (3G). Since “en0” will not be marked as an AF_INET interface in this scenario, the approach Erica outlined fails here. Here’s a dump of the available interfaces, their loopback and AF_INET status, and their assigned addresses:

2011-07-22 12:59:07.120 RowMotion[286:707] name: lo0, inet: 0, loopback: 0, adress:
2011-07-22 12:59:07.126 RowMotion[286:707] name: lo0, inet: 0, loopback: 0, adress:
2011-07-22 12:59:07.129 RowMotion[286:707] name: lo0, inet: 1, loopback: 0, adress:
2011-07-22 12:59:07.134 RowMotion[286:707] name: lo0, inet: 0, loopback: 0, adress:
2011-07-22 12:59:07.137 RowMotion[286:707] name: en0, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.141 RowMotion[286:707] name: ap0, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.145 RowMotion[286:707] name: pdp_ip0, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.149 RowMotion[286:707] name: pdp_ip0, inet: 1, loopback: 1, adress:
2011-07-22 12:59:07.154 RowMotion[286:707] name: pdp_ip1, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.157 RowMotion[286:707] name: pdp_ip2, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.161 RowMotion[286:707] name: pdp_ip3, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.165 RowMotion[286:707] name: en1, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.168 RowMotion[286:707] name: bridge0, inet: 0, loopback: 1, adress:
2011-07-22 12:59:07.172 RowMotion[286:707] name: bridge0, inet: 1, loopback: 1, adress:

See that last line? Yep, that’s the bridge interface we need to use to communicate with other devices on our “personal hotspot”. Here’s how to amend Erica’s code to make personal hotspots transparent:

// Matt Brown's get WiFi IP addy solution
+ (NSString *) localWiFiIPAddress
{
    BOOL success;
    struct ifaddrs *addrs;
    const struct ifaddrs *cursor;

    success = getifaddrs(&addrs) == 0;
    if (success) {
        cursor = addrs;
        while (cursor != NULL) {
            NSString *name = [NSString stringWithUTF8String:cursor->ifa_name];
            NSLog(@"available network interfaces: name: %@, inet: %d, loopback: %d, adress: %@", name, cursor->ifa_addr->sa_family == AF_INET, (cursor->ifa_flags & IFF_LOOPBACK) == 0, [NSString stringWithUTF8String:inet_ntoa(((struct sockaddr_in *)cursor->ifa_addr)->sin_addr)]);
            // the second test keeps from picking up the loopback address
            if (cursor->ifa_addr->sa_family == AF_INET && (cursor->ifa_flags & IFF_LOOPBACK) == 0) {
                // Wi-Fi adapter, or iPhone 4 Personal Hotspot bridge adapter
                if ([name isEqualToString:@"en0"] || [name isEqualToString:@"bridge0"]) {
                    NSString *address = [NSString stringWithUTF8String:inet_ntoa(((struct sockaddr_in *)cursor->ifa_addr)->sin_addr)];
                    freeifaddrs(addrs);
                    return address;
                }
            }
            cursor = cursor->ifa_next;
        }
        freeifaddrs(addrs);
    }
    return nil;
}

+ (BOOL) activeWLAN
{
    return ([self localWiFiIPAddress] != nil);
}

+ (BOOL) activePersonalHotspot
{
    // Personal Hotspot addresses are fixed to the 172.20.10 range
    NSString *localWifiAddress = [self localWiFiIPAddress];
    return (localWifiAddress != nil && [localWifiAddress hasPrefix:@"172.20.10"]);
}

I hope this will find its way into the sample code soon. A pull request is pending.

LLVM/Clang Code Coverage about to come

July 7, 2011 Leave a comment

According to information from this LLVM bug report, Nick Lewycky recently implemented support for generating gcov-compatible coverage files from LLVM/Clang. I’m not keen on replacing my local LLVM with an SVN build, but I’m really looking forward to finally ditching gcc.

Categories: Open Source

SubSpec available on NuGet

May 27, 2011 Leave a comment

SubSpec is finally available as a NuGet package. See the NuGet site for how to get started with NuGet. Once you have NuGet installed, it’s a simple matter of running Install-Package SubSpec or Install-Package SubSpec.Silverlight from the Package Manager console to get SubSpec integrated into your project.

Integrated into your project, you said? You mean “get the dll and reference it”? No, in fact, deployment as a separate dll is a thing of the past for SubSpec. SubSpec is an extremely streamlined extension of xUnit, and as such it fits into less than 500 lines of C# (excluding XML docs). This approach has several advantages:

  1. Faster builds: 500 lines of C# compile faster than resolving and linking against a library
  2. It fosters the creation of extensions (which is extremely common, at least in my usage of it)
  3. No need to get the source separately; you already have it!
  4. Experimental extensions can easily be shared as single files too, such as Thesis, AutoFixture integration…

I hope you like the new packages, please feel free to upvote SubSpec and SubSpec.Silverlight on the NuGet gallery and feel encouraged to write a review.

Building Mono.Cecil

September 7, 2010 Leave a comment

Building for .NET 3.5

If you want to use Cecil in a 3.5 project, you need to define the NET_3_5 symbol and change the target framework to 3.5 in Mono.Cecil.csproj.

Running the test suite:

Cecil will build fine after a checkout from jbevain’s github repo; the test suite, however, will not.

The following steps are necessary to successfully build and run the Cecil test suite:

  1. Add the Framework SDK (PEVerify, ILDasm) and the Framework install directory (ILAsm) to your PATH variable (we need the 4.0 tool set because the tests run over a few 4.0 assemblies): C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools;C:\Windows\Microsoft.NET\Framework\v4.0.30319
  2. Install NUnit 2.4.8 (it’s a little outdated, but Mono compatible).
  3. The tests can’t be run using ad-hoc TD.Net; they must be run using the NUnit runner. There’s a NUnit GUI project in the project’s root.
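Step 1 can be sketched as a batch fragment (the paths below are the defaults quoted above; adjust them to your SDK version and install location):

```bat
:: Append the .NET 4.0 SDK tools (PEVerify, ILDasm) and the framework
:: install directory (ILAsm) to PATH for the current cmd.exe session.
set PATH=%PATH%;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools;C:\Windows\Microsoft.NET\Framework\v4.0.30319
```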
Categories: .NET, Open Source

SubSpec: A declarative test framework for Developers

August 23, 2010 Leave a comment

In my last post I described Acceptance Testing and why it is an important addition to the developer-centric way of integration and unit testing.

I also described that Acceptance Tests should be as expressive as possible and therefore benefit from being written in a declarative style. While learning F# at the moment, I have come to the conclusion that writing declarative code is the key to avoiding accidental complexity (complexity in your solution domain that is not warranted by complexity in your problem domain). But not only acceptance tests benefit from a declarative style; I think it also goes a long way toward making unit and integration tests easier to understand.

SubSpec was originally written by Brad Wilson and Phil Haack. Their motivation was to build a framework that enables xUnit-based BDD-style testing. Given my desire to support a declarative approach to writing tests at all layers, I decided to fork the project and see what can be accomplished. I’m actively working on it, and the code can be found on my bitbucket site. I like the idea of having a vision statement, so here is mine:

SubSpec allows developers to write declarative tests operating at all layers of abstraction. SubSpec consists of a small set of primitive concepts that are highly composable. Based on the powerful xUnit testing framework, SubSpec is easy to integrate with existing testing environments.

Here’s a short teaser to show you how expressive a SubSpec test is:

Notes on .NET Testing Frameworks

August 19, 2010 Leave a comment

When I was introduced to TDD, the first testing framework I used was MsTest. Why? Because it was the next best thing available, and it had the nice (beginner) benefit of Visual Studio integration. Soon after, when I first hit the limitations of MsTest, MbUnit became my testing framework of choice. At the time I evaluated the existing testing frameworks, I came to the conclusion that MbUnit was the project with the most sophisticated facilities for writing tests at all layers (unit, integration, scenarios), and it was growing at a remarkable pace.

A year later, at InishTech, I started to use xUnit. There are several things about its design that I like better than what I have seen in other testing frameworks so far:

  • Classes containing tests can be plain C# classes, no deriving from a base class, no special attributes
  • No setup/teardown methods, instead convention based “Fixture” injection (using IUseFixture<T>)
  • No useless Assert.x() overloads that take custom messages; instead, well-formatted test output
  • Assert.Throws(Func<>) instead of [ExpectedException] attributes, giving finer-grained control over where an exception is expected
  • Clear, concise terminology. A test is a [Fact], a parameterized test is a [Theory]

To come to a conclusion, I think xUnit is a strong and lightweight testing framework.
MbUnit carries the massive overhead of the Gallio platform, making its test runner considerably slower than xUnit’s. What I especially like about xUnit is that it is an opinionated framework: it tries to force you into a certain way of thinking and thereby avoid common mistakes. From an extensibility point of view, xUnit has a lot to offer, and I find the clear API and the few concepts it is built on compelling. Unfortunately I have no experience extending MbUnit, but extending xUnit is really, really easy.

Categories: .NET, Open Source, Testing

Subversion Client Evaluation

April 19, 2010 2 comments

Even though I am a huge believer in DVCS, for some projects I am still bound to use subversion. I have development environments set up under Windows (primarily .NET), Mac OS (Mono and iPhone) and Linux (Mono, Haskell), so I’d like my tools to be nearly identical on all three platforms to reduce any friction not directly related to writing code.

I use version control on any project I do, some of them being cross-platform projects (e.g. mono ports), so having frictionless access to my VCS on any platform is of primary concern to me. All version control systems I happen to use at the moment run on all platforms and have good (svn, git) or excellent (hg) command line utilities, but I like the comfort of a GUI. Especially for browsing history and diffing or merging, graphical tools are irreplaceable.

Finding a good multi-platform GUI client for subversion is not very easy; I have evaluated quite a few of them but haven’t reached any final conclusion. Here are my notes on each:

Qt based, only very basic repository operations, no stable version available, development has stalled in 2007. No recent releases (SVN 1.6.x) available.

Qt based, lightweight and solid. Actively developed, recent releases available. Easy project set-up and configuration. The GUI is very streamlined and sufficient for day to day use. Very intuitive. No merge support.

wxWidgets based; development stalled in late 2009. There might be future maintenance, but so far it doesn’t look very promising. Ugly GUI; I haven’t tried it any further.

Qt based, rich feature set and actively developed. The user interface is not very intuitive but you’ll get used to it. I found refreshing the repository status to be very slow. So far the most complete OSS choice (including merge support). I’d prefer QSvn for the standard tasks.

Java based; sadly enough, it’s commercial. You get a 30-day trial of the pro version; after that, only the features of the “Foundation” version remain enabled. I have found it to be very intuitive, stable and fast. SmartSVN has some features that make life with SVN easier, such as prepared commits (think of a private patch queue; unfortunately not in the free edition) and a graphical revision graph. Very good merge support.

In principle I am a fan of stand-alone utilities, as I think of coding and version control as sequential tasks. Having to collect my changes manually forces me to review them once more before finally committing, which is a good thing. Subversion is a very mature VCS and has been around for a very long time, so I expected to find some decent OSS clients for it. To be honest, the existing projects seem to be far behind here. The commercial SmartSVN is the best stand-alone client I have seen.

Given this situation, I am heavily leaning towards using IDE integrated tools for subversion. In contrast to a DVCS, using a centralized system takes away more than enough freedom anyway, so using IDE integrated tools won’t hurt any further. Most of the IDEs I use provide native Subversion support (XCode, MonoDevelop, Eclipse) or via a plugin (AnkhSVN for Visual Studio). The supported features are equivalent to what a simple tool like QSvn has to offer, for everything else I’ll probably use SmartSVN.

New Project: DirectoryVersioningService

March 15, 2010 Leave a comment

My brother asked me the other day whether there’s any software that can keep track of a directory’s contents and automatically create a backup on each change. He works at an equipment supplier for events (sound, light, rigging etc.), and they have a software package for managing their inventory and rental business. This software generates a variety of reports using the List&Label report engine, driven by report templates that are stored on a network share. The templates that shipped with the software didn’t look nice, nor were they sophisticated enough to capture all the required information, so they find themselves messing with the report templates very often. And from time to time they break something. Figuring out what exactly broke is a time-consuming process, and it is especially annoying when you sit right next to a customer and simply want to check in some equipment and hand him an invoice.

“This is a perfect use case for Mercurial” immediately came to my mind. The idea was to have a Windows service monitor the directory on the network share using file system events and perform a commit on each change. I did a little googling just to check whether anyone had done this before, but didn’t find anything useful. Four solid hours later, I had the first version of my DirectoryVersioningService ready, including a simple GUI to install/uninstall and rename the service, so you can install multiple copies of it to monitor different directories. A side effect of this is that I now know how Windows services work. Especially the install and uninstall process takes a little time to grasp, but it’s easy once you’ve got it.

Each time a change is made to the directory, a timer starts and is set to execute a commit in 5 seconds. This is because operations like moving or renaming a file cause several file system events to be triggered, and committing intermediate states is not what we want. If an event is triggered while the timer is already running, the timer is restarted, effectively establishing a 5-second “threshold” before a commit. The versioning service needs to track added, removed and renamed files automatically too, so the following mercurial commands are issued for each commit:

hg addremove -s 50

hg commit -m "Automatic commit."

So far, it works pretty well. You can find the code and an executable download at my bitbucket repository. It’s bare bones at the moment and I haven’t had time to write usage instructions, but I will get around to it after my exams.

There are a couple of alternatives, though. One solution is to use a versioning file system; sadly enough, none is supported on the Windows platform. Another possibility is using commercial software like FileHamster. Neither solution feels right to me. From what I know about versioning file systems, the tool support is very immature, and it would require setting up a Samba server. Commercial software costs money, may have bugs I can’t fix, and is yet another tool people have to learn. From looking at it, I get the impression it’s more like version control done badly: nothing that any mature VCS couldn’t do better.

Categories: Open Source, Projects

wxMaxima CAS is Amazing

March 4, 2010 Leave a comment

Yesterday I forgot my beloved Casio FX991-ES calculator at my friend’s. At the moment I am very busy preparing for my math exams on March 19th, so this unfortunate situation had to be resolved somehow.

The Mac OS X Snow Leopard on my surf station (it’s a Mac Mini, if you wonder) ships with a cheap scientific calculator. It is neither capable of calculating functions, nor can you enter complete expressions; you have to work with intermediate results, as only a single operation can be entered at a time. It follows the value->operator->result approach. I am not all too deep into calculator terminology, so I don’t know what exactly that’s called.

In contrast, the Casio FX991-ES features a sophisticated 96×96 dot-matrix display which you can use to enter complex expressions via their graphical representations. You can do pretty much everything I need for school with this calculator; it has a great matrix mode, and the statistics features are also very nice. Additionally, it has shortcuts for the 40 most important natural constants, so I don’t have to type e=1.6*10^-19 all the time. The only downside is that it’s a little slow at calculating approximations for integrals or iterating Newton’s algorithm, but that’s okay, as I don’t need to do that very often.

I googled a little and found an open-source CAS called Maxima. It’s cross-platform, but unfortunately it’s only a command-line application, so I needed to find a good GUI for it. Since I work extensively on MacOS, Windows and Linux at the moment and often switch back and forth, a uniform cross-platform GUI is a big bonus for me (as it was with DiffMerge, a merge/diff tool from one of my recent posts). The best GUI I could find meeting these requirements was wxMaxima, a LaTeX-based front end built on wxWidgets.

Unfortunately, the input formatting is not as nice as it is on my FX991-ES; however, the results returned by Maxima are (graphically) formatted using the jsMath TeX font (which I strongly recommend installing, as it won’t get installed automatically; see the instructions).

What I really like about wxMaxima is the ability to create “interactive” documents, as demoed in their great tutorials. As a wxMaxima document is basically a LaTeX document, you can apply all the crazy formatting, include interactive GNUPlot diagrams, and a lot more.

Maxima has a very clean syntax, supports all the important algorithms you need, and is easily programmable where something is missing. Particularly cool is the ability to simplify expressions. Another thing which I think is really great is the menu structure and the auto-completion for all the available commands, making it feel a little like a good IDE.

Categories: Open Source, Tools

Modelshredder: Tracking down InvalidProgramException

January 1, 2010 Leave a comment

I received my first bug report for modelshredder the other day. When trying to convert a sequence of objects into a DataTable, the following exception occurred:

I did some immediate research on possible causes for such an exception to be thrown. Microsoft’s Knowledge Base indicated there might be a problem with the number of local variables allocated inside the injected method; however, this was not the case, since modelshredder uses only three local variables, regardless of the type of object. After some back and forth with the bug reporter, we were able to construct a sample that reproduced the bug. Some trial and error with ShredderOptions including different subsets of members revealed that the exception only occurred when the injected code tried to access an indexer property. The cause is pretty clear when taking a look at the MSIL generated for a property access.

ilgen.Emit(OpCodes.Ldloc_0);     // Load array on evaluation stack
ilgen.Emit(OpCodes.Ldc_I4_S, i); // Load array position on eval stack
ilgen.Emit(OpCodes.Ldarg_0);     // Load ourselves on the eval stack
ilgen.Emit(OpCodes.Call, pi.GetGetMethod());
// Check if we need to box a value type
if (pi.PropertyType.IsValueType)
    ilgen.Emit(OpCodes.Box, pi.PropertyType);

// Store value in array; this pops all values from the eval stack that were pushed in this loop iteration
ilgen.Emit(OpCodes.Stelem_Ref);
As you can see, the code expects the getter to be callable without any parameters, which is not the case if pi.GetGetMethod() returns the get method of an indexer. Since I can’t imagine there’s any use in representing the contents of an indexer property in tabular form, I decided to simply ban indexer properties from the ShredderOptions. To do so, I have added a validation inside the ShredderOptions constructor that checks all PropertyInfos for index parameters.

PropertyInfo pi = member as PropertyInfo;

if (pi != null)
{
    if (pi.GetIndexParameters().Length > 0)
        throw new ArgumentException("May not contain indexer properties.", "members");
}

Even though the fix was pretty easy once the cause was identified, bugs in MSIL injection are very hard to track down. The exception could just as well have pointed to any other part of the injected code being incorrect. I haven’t seen any effective way (or tool, for that matter) to debug or review runtime-injected code yet. It appears one is pretty much left with nothing but trial and error in such cases.

Categories: Open Source, Projects