Running MVC 3 applications with Xamarin Studio/MonoDevelop

One of the things I often see asked on StackOverflow and in other places related to Xamarin Studio and MonoDevelop is problems with the ASP.NET Webstack and references to various required assemblies. Typically people get errors like:

Could not load file or assembly 'System.Web.WebPages, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

or

CS1705: Assembly `System.Web.Mvc, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' references `System.Web.WebPages, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' which has a higher version number than imported assembly `System.Web.WebPages, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'
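For the first of these (the runtime "Could not load file or assembly" variant), a binding redirect in web.config can sometimes serve as a stopgap; it is the same mechanism used for System.Web.Extensions in a later post on this page. This is only a sketch, and the newVersion value is an assumption about which System.Web.WebPages assembly your system actually has:

```xml
<!-- Sketch only: redirect requests for the old System.Web.WebPages version
     to the one actually installed. Check which version your system has;
     newVersion="2.0.0.0" is an assumption. -->
<runtime>
	<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
		<dependentAssembly>
			<assemblyIdentity name="System.Web.WebPages" culture="neutral" publicKeyToken="31bf3856ad364e35" />
			<bindingRedirect oldVersion="1.0.0.0" newVersion="2.0.0.0"/>
		</dependentAssembly>
	</assemblyBinding>
</runtime>
```

Note that a redirect does not help with the CS1705 compile-time variant; for that, the NuGet route described below is the way to go.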

This is a quick guide to getting up and running with an MVC 3 project with Razor support in Xamarin Studio (the instructions should apply to MonoDevelop as well).

Basically NuGet comes to the rescue, so the first thing to do is install the NuGet add-in in Xamarin Studio:

  • Open the Add-in Manager
  • Select the Gallery tab and choose Manage Repositories
  • Add the repository for the unofficial NuGet add-in: http://mrward.github.com/monodevelop-nuget-addin-repository/4.0/main.mrep
  • Install the NuGet package from the newly added repository

This should get you up and running; otherwise there are more specific instructions at https://github.com/mrward/monodevelop-nuget-addin. If you are running an unstable version of Xamarin Studio (i.e. the Beta or Alpha channel) you might have to build the add-in from source and update the Dependencies section in src/MonoDevelop.PackageManagement/MonoDevelop.PackageManagement.addin.xml so the required MonoDevelop version matches the version you have installed.

After that you can build it using xbuild on the command line and generate a plugin package using:

	mono mdtool.exe MonoDevelop.PackageManagement.dll

This should give you an .mpack file you can install from the Add-in Manager by choosing "Install from file".

With NuGet installed you can create an MVC 3 project in Xamarin Studio like you normally would:

Create the MVC 3 Project

After that, the first thing that will strike you as odd is that you will probably be missing some references that are not present on your system, depending on your exact version of Mono (I am running 3.0.10, but the same problems occur on 2.10.*, as far as I know). So first off, let's remove the two failing references.

There are two missing references

After that we open the NuGet Package Manager:

Nuget Package Manager

And search for the package Microsoft.AspNet.WebPages

The Microsoft.AspNet.WebPages package

And press the Add button to add it to our solution. And that's it: you are now running ASP.NET MVC 3 in Xamarin Studio. Big kudos go to the SharpDevelop team and Matt Ward for bringing NuGet to MonoDevelop/Xamarin Studio.

Posted on 24 Jun 2013 by Jakob T. Andersen

Mono: Internal compiler error

My current half-hobby, half-business project involves building and running the OSS CMS Umbraco on Mono. The building part is not strictly necessary, so I guess that is the hobby part; running Umbraco on Mono, however, is something one of my clients would like to do. This is not the first time I have taken a piece of "off-the-shelf" .NET software and tried to build and run it on Mono; previously I have built, amongst others, NHibernate using Mono (and MonoDevelop). A few times I have encountered the not-so-helpful error "Internal compiler error" followed by different error messages. In the Umbraco project I encountered it in the form:

Internal compiler error: Method not found: 'UmbracoExamine.ContentExtensions.ToXDocument'.

It was followed by a related error stating that an invalid type conversion was attempted on the same line as the error above. There are multiple ways to figure out what is going on. One is from MonoDevelop: poke around with the built-in Assembly Browser and check whether the method mentioned in the error is present, whether the current project actually references the assembly, and whether all of its references are satisfied. However, there is an easier way (at least if you ask me). When building in MonoDevelop, the build output prints the actual compiler command line used to build each project in the solution. It is a massive string, so I have shortened it; on my OS X system it looks like this:

/Library/Frameworks/Mono.framework/Versions/2.10.9/bin/dmcs /noconfig "/out:/Users/jta/Projects/umbracov4/umbraco_a7f6b8c00dd8/src/umbraco.presentation/bin/umbraco.dll" .............

As you can see, it is a call to the dmcs compiler that builds umbraco.dll, followed by a list of all files in the project and all references. If we copy this to a shell and set the MONO_LOG_LEVEL environment variable to debug, we get a load of information about what the compiler is actually doing and what our "Internal compiler error" actually is.

export MONO_LOG_LEVEL=debug; /Library/Frameworks/Mono.framework/Versions/2.10.9/bin/dmcs ........

The result of this might seem like an overload of information, but if you scroll around and find the actual error message from before, then pay attention to the parts above it. In our case it details that the compiler has problems loading required assemblies and hence cannot find the extension method we are using.

The detailed debug information from the C# compiler in mono

The reason this happens is that the relationship between Umbraco and Examine (or rather UmbracoExamine) has a weird structure that results in circular references, so basically we have to copy DLLs around between the projects to make it all work. There should be work in progress to remove these circular references from the Umbraco source, which sounds great, as copying DLLs around shouldn't be necessary.

To actually make the Umbraco source compile, I have to copy like this for a number of the assemblies that UmbracoExamine references:

cp src/umbraco.cms/bin/Debug/cms.dll lib/cms.dll
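The same copy has to be repeated for each assembly involved in the cycle, so a small loop keeps it manageable. This is only a sketch: cms.dll is the one assembly named above, and any further names you add to the list depend on your checkout. The mkdir/touch lines merely create stand-ins for real build output so the sketch runs on its own; drop them when using it against an actual Umbraco tree.

```shell
# Stand-ins for real build output so this sketch is self-contained;
# remove these two lines when running against a real checkout.
mkdir -p src/umbraco.cms/bin/Debug lib
touch src/umbraco.cms/bin/Debug/cms.dll

# Copy each assembly UmbracoExamine needs into lib/; extend the list
# with whatever other assemblies your checkout requires.
for asm in cms; do
    cp "src/umbraco.cms/bin/Debug/${asm}.dll" "lib/${asm}.dll"
done
ls lib
```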
Posted on 07 Jul 2012 by Jakob T. Andersen

ASP.NET MVC JsonResult on Mono and the JavaScriptSerializer

I have been doing a little work on an ASP.NET MVC solution in Visual Studio on Windows, and a little later I moved it to MonoDevelop on OS X to continue development. Opening the Visual Studio solution file in MonoDevelop just works, and compiling the code has no issues; however, running it I got the following exception:

Cannot cast from source type to destination type.

System.InvalidCastException: Cannot cast from source type to destination type.
  at System.Web.Script.Serialization.JavaScriptSerializer..ctor (System.Web.Script.Serialization.JavaScriptTypeResolver resolver, Boolean registerConverters) [0x0000d] in /private/tmp/monobuild/build/BUILD/mono-2.10.8/mcs/class/System.Web.Extensions/System.Web.Script.Serialization/JavaScriptSerializer.cs:66 
  at System.Web.Script.Serialization.JavaScriptSerializer..cctor () [0x00000] in <filename unknown>:0 

After a little investigation I found that mixed-up DLLs are causing this. You can see in the screenshot below that two versions of System.Web.Extensions.dll are loaded, both 3.5 and 4.0.

System.Web.Extensions loaded twice in two different versions

The reason for this seems to be that System.Web.Mvc.dll is compiled referencing the 3.5 version and not the 4.0 version, so we can easily fix this with a redirect using the assemblyBinding element in our web.config:

<runtime>
	<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
		<dependentAssembly>
			<assemblyIdentity name="System.Web.Extensions" culture="neutral" publicKeyToken="31bf3856ad364e35" />
			<bindingRedirect oldVersion="3.5.0.0" newVersion="4.0.0.0"/>
		</dependentAssembly>
	</assemblyBinding>
</runtime>

And that should let you proceed with your ASP.NET MVC development on Mono.

UPDATE: This is already reported as a bug, describing the workaround as well: https://bugzilla.novell.com/show_bug.cgi?id=664813

Posted on 18 Jan 2012 by Jakob T. Andersen

Generating static blog using Mono's LameBlog

This blog is generated using Mono LameBlog, a tool created by Miguel de Icaza to run his own blog. I stumbled upon it in the Mono source tree a couple of years back when I was doing some work with Mono. Without offending anyone, I think you can say that LameBlog is not a polished piece of software, but it is actually very convenient for a couple of reasons:

  • Blog entries in your favorite VCS
  • Easy offline editing with your favorite editor
  • Simple hosting setup, only needs to support basic HTML
  • Scalable as it is only HTML

It works by generating the whole site structure from entries stored in a predefined format on your local disk. To do this it uses a template and a configuration file which you can use to tweak the appearance (I didn't do this much, as you can see). Furthermore, the deployment process is hidden in the project's makefile and utilizes rsync to push changes to the server.

So to sum up the workflow, this is how it looks when I write a blog post:

vim ~/Activity/2012/jan-16.html
#Write the blog post using HTML and small #-prefixed lines for imagehandling etc
cd Projects/lb
make push

Note that I keep my posts in a VCS, but those commands are not included above.

To make the interactive features of a common blog, like commenting and search, work, third-party providers are used, namely Disqus and a custom Google Search. It is a bit of an experiment, primarily aimed at making blogging as convenient as possible for me so I might do it more often, but we will have to see.

Currently this site is hosted on a free-usage-tier EC2 micro instance at Amazon with an nginx webserver.
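Since the output is plain HTML, the nginx side amounts to a single static server block. A minimal sketch, assuming the generated site is rsynced to /var/www/blog; the path and server name are made up for illustration:

```nginx
server {
    listen 80;
    server_name example.com;      # hypothetical domain

    root /var/www/blog;           # where 'make push' rsyncs the generated HTML
    index index.html;

    location / {
        try_files $uri $uri/ =404;  # serve files directly, 404 otherwise
    }
}
```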

Posted on 16 Jan 2012 by Jakob T. Andersen

Writing a calculator in C# using Irony

My friend Mark wrote a post last week about implementing a calculating compiler using SableCC. I have worked with SableCC on a project where we needed to parse C# code, and I have to admit I don't recall SableCC as a friendly framework; reading Mark's post proves my memory right.

First off, Mark's grammar is very specific to SableCC: a lot of tricks are needed to make SableCC accept the grammar, and it is complicated further by handling operator precedence. If we disregard performance considerations and focus on an easily understandable and maintainable compiler structure, the grammar for our parser generator would ideally look very much like the Extended Backus-Naur Form (EBNF) of the language.

number = digit+ | digit+ '.' digit+
lparen = '('
rparen = ')'
binop = '+' | '-' | '/' | '*' | '%'
funcname = letter+
expression := lparen expression rparen | expression binop expression | funcname lparen expression rparen | number

Irony is, like SableCC, a scanner/parser generator, but it is implemented as an internal DSL in C# for specifying grammars in a format very close to EBNF. The grammar is specified in the constructor body of a class inheriting from Irony's Grammar class. The EBNF described above (slightly modified for readability) looks like this:

public class SimpleCalcGrammar : Irony.Parsing.Grammar{
    public SimpleCalcGrammar(){
        var number = TerminalFactory.CreateCSharpNumber("number");
        var identifier = TerminalFactory.CreateCSharpIdentifier("identifier");
        var expression = new NonTerminal("expression");
        var binexpr = new NonTerminal("binexpr");
        var parexpr = new NonTerminal("parexpr");
        var fncall = new NonTerminal("fncall");
        var binop = new NonTerminal("binop");
            
        expression.Rule =  parexpr | binexpr | number | fncall;
        parexpr.Rule = "(" + expression + ")";
        binexpr.Rule = expression + binop + expression;
        binop.Rule = Symbol("+") | "-" | "/" | "*" | "%";
        fncall.Rule = identifier + "(" + expression + ")";
        this.Root = expression;
        //...
    }
}

The first few lines are initialization of our terminals and productions. This is pretty trivial, but a few things are worth noting. Irony is a very rich framework that provides a lot of prebaked functionality; one example is the TerminalFactory for creating commonly used terminals. In our case we use C#'s format for numbers and identifiers, which means we get support out of the box for things like type-specifying suffixes on numbers.

The interesting part comes after initialization, where we set up the rules for the productions. Irony relies heavily on operator overloading, so using our productions and the operators from EBNF we can describe the language pretty straightforwardly. Lastly we specify which production is the root of our program, that is, the production that contains the entry point for the whole language.

Next up we need to handle operator precedence, fortunately Irony knows that this is a common challenge when developing languages and hence it supports registering our operators and their precedence using the following code:

RegisterOperators(1, "+", "-");
RegisterOperators(2, "*", "/", "%");

Having specified this we have a working parser that can be used like this:

SimpleCalcGrammar g = new SimpleCalcGrammar();
Parser p = new Parser(g);
ParseTree t = p.Parse("25-37+2*(1.22+cos(5))*sin(5)*2+5%2*3*sqrt(5+2)");

This gives us a parse tree that is pretty rough around the edges and hard to work with; what we want instead is a nice Abstract Syntax Tree that contains exactly the information we need. On our NonTerminal instances we can specify which nodes they should be transformed into. Fortunately Irony contains some of the basic nodes needed in many languages, for instance the binary expression node, so we can specify the built-in BinExprNode on our binexpr NonTerminal like this:

var binexpr = new NonTerminal("binexpr", typeof(BinExprNode));

For us to parse the language we need one additional node in our AST: the FunctionCall node (the observant reader might have noticed I have chosen to implement an open-ended grammar that allows extending with new built-in functions without modifying the grammar). Irony has an AST node for function calls built in, but for the sake of the example I will show a custom implementation here:

public class FunctionCallNode : AstNode{
    public string Name;
    public AstNode Argument;
    public override void Init (Irony.Parsing.ParsingContext context, Irony.Parsing.ParseTreeNode treeNode)
    {
        base.Init (context, treeNode);
        Name = treeNode.ChildNodes[0].FindTokenAndGetText(); 
        Argument = AddChild("Arg", treeNode.ChildNodes[1]);
        AsString = Name; 
    }
}
//Adding the node to our function call node
var fncall = new NonTerminal("fncall", typeof(FunctionCallNode));

So implementing our own AST nodes is pretty simple, and Irony has a rich infrastructure for getting the text of tokens and adding child nodes, as you can see in the sample. Our parse tree contains a lot of "garbage" that we don't want transferred to our AST, so we need to tell Irony to skip these nodes; this is done by marking nonterminals as transient like this:

MarkTransient(parexpr, expression);

And at last we need to set a flag so that the parser generates the AST, using the following line of code:

LanguageFlags = LanguageFlags.CreateAst;

The next step is to actually work with the AST for semantic analysis, code generation or, as in our case, execution of the calculation. Mark did this using a visitor, and Irony's AST supports visitors as well, so we can make the "PrintVisitor" Mark used for debugging like this:

public class PrintVisitor : IAstVisitor{
    int indentation = 0;
    public void BeginVisit (AstNode node)
    {
        for (int i = 0; i < indentation; i++) {
            Console.Write("\t");
        }
        Console.WriteLine(node.ToString());
        indentation++;
    }
    
    public void EndVisit (AstNode node)
    {
        indentation--;
    }    
}

Using a visitor looks like this:

SimpleCalcGrammar g = new SimpleCalcGrammar();
Parser p = new Parser(g);
ParseTree t = p.Parse("25-37+2*(1.22+cos(5))*sin(5)*2+5%2*3*sqrt(5+2)");
var astnode = (AstNode)t.Root.AstNode;
astnode.AcceptVisitor(new PrintVisitor());

And the result of this operation is:

+(operator)
    Arg: +(operator)
        Arg: -(operator)
            Arg:25
            Arg:37
        Arg: *(operator)
            Arg: *(operator)
                Arg: *(operator)
                    Arg:2
                    Arg: +(operator)
                        Arg:1.22
                        Arg: cos
                            Arg:5
                Arg: sin
                    Arg:5
            Arg:2
    Arg: *(operator)
        Arg: *(operator)
            Arg: %(operator)
                Arg:5
                Arg:2
            Arg:3
        Arg: sqrt
            Arg: +(operator)
                Arg:5
                Arg:2

However, there is no need for us to create a print visitor: Irony comes with a UI that allows us to work with our grammar, parse samples, and show us the AST, a parser trace with parser states, and much more. This is very useful for seeing what the grammar actually does under the covers (shift/reduce). We could go ahead and implement pretty much the same visitor Mark used for SableCC, but let's do something different. Irony has support for a basic LanguageRuntime, and actually for a basic interpreter as well. So if we tell our AST nodes how to evaluate themselves, we get all that infrastructure given to us by Irony!

So let's override the Evaluate method on our FunctionCallNode so it can be used in Irony's interpreter infrastructure:

public class FunctionCallNode : AstNode{
    //....
    public override void Evaluate (Irony.Interpreter.EvaluationContext context, Irony.Ast.AstMode mode)
    {
        Argument.Evaluate(context, AstMode.Read); //Evaluate the argument the result is saved in context.Data
        double input = Convert.ToDouble(context.Data[0]);
        double result;
        if(Name == "sqrt"){
            result = Math.Sqrt(input);
        }else if(Name == "cos"){
            result = Math.Cos(input);    
        }else if(Name == "sin"){
            result = Math.Sin(input);    
        }else{
            throw new NotSupportedException("Method " + Name + " not supported");    
        }
        context.Data.Replace(1, result); //Replace the argument value on the stack with the result of the function call
    }
}

Yes, in real life our methods would be in a method table and not hardcoded here; this is just to demonstrate the functionality of the interpreter in Irony. To use the interpreter we write code like this:

var interpreter = new Irony.Interpreter.ScriptInterpreter(new LanguageData(new SimpleCalcGrammar()));
interpreter.Evaluate("25-37+2*(1.22+cos(5))*sin(5)*2+5%2*3*sqrt(5+2)");
Console.WriteLine(interpreter.EvaluationContext.LastResult);

Voila, we are done with the code that corresponds to Mark's sample. The result of running the above is -9.83033874894108, and the built-in Irony interpreter handles almost everything for us, leaving me with only 57 lines of pure C# code:

public class FunctionCallNode : AstNode{
    public string Name;
    public AstNode Argument;
    public override void Init (Irony.Parsing.ParsingContext context, Irony.Parsing.ParseTreeNode treeNode)
    {
        base.Init (context, treeNode);
        Name = treeNode.ChildNodes[0].FindTokenAndGetText();
        Argument = AddChild("Arg", treeNode.ChildNodes[1]);
        AsString = Name;
    }
    
    public override void Evaluate (Irony.Interpreter.EvaluationContext context, Irony.Ast.AstMode mode)
    {
        Argument.Evaluate(context, AstMode.Read);
        double input = Convert.ToDouble(context.Data[0]);
        double result;
        if(Name == "sqrt"){
            result = Math.Sqrt(input);
        }else if(Name == "cos"){
            result = Math.Cos(input);    
        }else if(Name == "sin"){
            result = Math.Sin(input);    
        }else{
            throw new NotSupportedException("Method " + Name + " not supported");    
        }
        context.Data.Replace(1, result);
    }
}
public class SimpleCalcGrammar : Irony.Parsing.Grammar{
    public SimpleCalcGrammar(){
        
        var number = TerminalFactory.CreateCSharpNumber("number");
        var identifier = TerminalFactory.CreateCSharpIdentifier("identifier");
        var expression = new NonTerminal("expression");
        var binexpr = new NonTerminal("binexpr", typeof(BinExprNode));
        var parexpr = new NonTerminal("parexpr");
        var fncall = new NonTerminal("fncall", typeof(FunctionCallNode));
        var binop = new NonTerminal("binop", "operator");
        
        expression.Rule =  parexpr | binexpr | number | fncall;
        parexpr.Rule = "(" + expression + ")";
        binexpr.Rule = expression + binop + expression;
        binop.Rule = Symbol("+") | "-" | "/" | "*" | "%";
        fncall.Rule = identifier + "(" + expression + ")";
        
        RegisterPunctuation("(",")");
        RegisterOperators(1, "+", "-");
        RegisterOperators(2, "*", "/", "%");
        
        
        MarkTransient(parexpr, expression);
        
        this.Root = expression;
        this.LanguageFlags = LanguageFlags.CreateAst;
    }
}
Posted on 07 Oct 2009 by Jakob T. Andersen