Our polyglot approach: Getting started with Rust

by Dan Persa - 20 May 2016

I recently started using Rust – the programming language. My team has been exploring a polyglot approach to building services – we think we should always use the right tool for the job. We also believe that we should build our services so that others can use them, so while prototyping we’ve built projects in many programming languages as part of migrating our shop monolith to microservices.

I already hear you asking: What projects have you worked on? We have Skipper, built with Go. My colleague Arpad wrote a blog post about it. I also did some tech talks about Innkeeper, a reactive RESTful API we wrote in Scala, using the Akka HTTP and Slick frameworks. Some of my colleagues played around with Elm and built a game in just five days during Zalando’s Hack Week. Elm is a functional language similar to Haskell, built on top of JavaScript. And we have a layout service in Node, Tailor.

It was just a matter of time before I started experimenting with Rust. I started slowly, building a mock of our OAuth service and including it, as a Docker image, in our CI pipeline for Innkeeper. In this post I’ll talk about my experience getting started with Rust, with a second post to follow that explains how to include it in a Docker image.

Why Rust?

After listening to some talks on Rust, I was immediately interested. The things I instantly liked about it:

  • The fast compilation (I’m working with Scala right now and the compilation can sometimes take too much time)
  • Memory safety and freedom from data races by default, without the need for a garbage collector – with a little extra effort from the developer, of course
  • Pattern matching: Once you get used to it, it’s hard to go back to languages without it
  • No need for a Virtual Machine – now that we have Docker, having the same code running on different machines isn’t as important as it was a while ago.

First Steps

The first service I built with Rust was a JSON API called rusty-oauth. To start a new project in Rust, you first install Cargo, Rust’s build tool and package manager. Cargo helps you to:

  • Initialise new projects
  • Build, release, run, and test your projects
  • Declare external dependencies (called crates) for your project (a Rust crate is like a Java JAR or a Ruby gem)

cargo new --bin rusty-oauth

The above command will create a new ‘hello world’ app for you. Use cargo run inside the directory to compile and run the app.

The Rusty OAuth Service

I now want to go through the code of this project and explain some of the most important concepts of Rust while doing so. I won’t cover all of Rust’s features, but I’ll cover enough to make those of you considering Rust a little curious.

Let’s dive into the existing code. First of all the main file:

extern crate rustc_serialize;
#[macro_use] extern crate log;
extern crate env_logger;
#[macro_use] extern crate nickel;
mod token_info;
use nickel::{Nickel, MediaType, HttpRouter, QueryString};
use nickel::status::StatusCode::BadRequest;
use rustc_serialize::json;
use token_info::TokenInfo;

To use crates from outside of your project, you need the extern crate declaration. In our case, we use the rustc_serialize, log, env_logger, and nickel crates.

We then use the mod keyword to define a new module. A module is a collection of items: functions, structs, traits, impl blocks, and other modules.
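As a small sketch (with hypothetical names, not taken from rusty-oauth), a module groups items and controls their visibility:

```rust
// A module groups items; everything inside is private unless marked `pub`.
mod token_utils {
    // Public: reachable from outside the module.
    pub fn describe(scopes: usize) -> String {
        format!("token with {} {}", scopes, plural(scopes))
    }

    // Private by default: only visible inside this module.
    fn plural(n: usize) -> &'static str {
        if n == 1 { "scope" } else { "scopes" }
    }
}

fn main() {
    // Items are reached through the module path, or imported with `use`.
    println!("{}", token_utils::describe(2)); // prints "token with 2 scopes"
}
```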

We use the use keyword to import functions, structs, and traits from other modules which we’d like to use in our current file. In this case, we import from the nickel crate:

fn main() {
    env_logger::init().unwrap();
    let mut server = Nickel::new();
    info!("Welcome to rusty-oauth");

    server.get("/oauth2/tokeninfo", middleware! { |req, mut res|
        let token = match req.query().get("access_token") {
            Some(token) => token.to_string(),
            None => {
                return res.send(invalid_request(ACCESS_TOKEN_INVALID));
            }
        };
        debug!("Request token: {:?}", token);
        let token_info = match TokenInfo::from_query_param(&token) {
            Ok(token_info) => token_info,
            Err(err) => {
                res.set(BadRequest);
                return res.send(invalid_request(err));
            }
        };
        debug!("Token info: {:?}", token_info);

        res.set(MediaType::Json);
        json::encode(&token_info).unwrap()
    });

    server.listen("0.0.0.0:6767");
}

It’s time to look at the main function. We start by initialising the logger:

env_logger::init().unwrap();

In Rust, functions that can fail typically return a Result. A Result is a simple enum with two possible variants: Ok or Err.

enum Result<T, E> {
    Ok(T),
    Err(E),
}

There are two ways of extracting the value from a Result. The first (and unsafe) way is using the unwrap function, as you can see above with the env_logger. If there’s an error, the unwrap function “panics”, unwinding the stack for the current thread (while calling destructors for each of the resources owned by the stack). As our program has only one thread, it will exit with an error message.

The safe way to extract the value is by using pattern matching (we have an example in the main function above, where we match on the result of TokenInfo::from_query_param). What we are doing there is handling both cases: on success, we keep the token_info; on failure, we return a BadRequest back to the user.
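The two extraction styles can be sketched with a toy function (parse_port is ours, not part of the service):

```rust
// A fallible function: returns Ok on success, Err with a message on failure.
fn parse_port(s: &str) -> Result<u16, String> {
    s.parse::<u16>().map_err(|e| format!("invalid port {:?}: {}", s, e))
}

fn main() {
    // Unsafe style: unwrap panics (unwinding the thread's stack) on Err.
    let port = parse_port("8080").unwrap();
    println!("listening on {}", port);

    // Safe style: pattern matching forces us to handle both cases.
    match parse_port("not-a-port") {
        Ok(p) => println!("port: {}", p),
        Err(err) => println!("bad request: {}", err),
    }
}
```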

As we can see from the definition of the Result enum, Rust also supports generics:

fn invalid_request<S: Into<String>>(err: S) -> String {
    format!("{{\"error\":\"invalid_request\",\"error_description\":\"{}\"}}", err.into())
}

Next we look at how to define a function in Rust. By default, functions are private to their module. By using the pub keyword, we’re able to make a function public.

By omitting the semicolon (;) at the end of the last line of a function body, you tell the compiler that it’s an expression rather than a statement. Because the expression’s type matches the function’s return type, we can also omit the return keyword.
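As a toy illustration (not from the service), here are both rules at work:

```rust
// The last line has no `;`, so it's an expression whose value the
// function returns – no `return` keyword needed.
fn square(x: i32) -> i32 {
    x * x
}

// `return` is still useful for early exits:
fn clamp_to_byte(x: i32) -> i32 {
    if x > 255 {
        return 255;
    }
    x
}

fn main() {
    println!("{} {}", square(7), clamp_to_byte(300)); // prints "49 255"
}
```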

pub type Scope = String;
pub type Realm = String;
pub type Uid = String;

pub struct TokenInfo {
    scopes: Vec<Scope>,
    realm: Realm,
    uid: Option<Uid>
}

Here, we can define some public type aliases. As I mentioned earlier, everything is private as long as you don’t use the pub keyword, and I find this to be quite a good language design decision.

We also define a struct, TokenInfo. As the uid is optional, we use the Option<T> enum to express this:

impl TokenInfo {
    fn new(scopes: Vec<&str>, uid: Option<Uid>, realm: &str) -> TokenInfo {
        let s = scopes.iter().map(|s| s.to_string()).collect();
        TokenInfo { scopes: s, realm: realm.to_string(), uid: uid }
    }

    pub fn from_query_param(param: &str) -> Result<TokenInfo, String> {
        let parts: Vec<&str> = param.split("-").collect();
        if parts[0] != "token" {
            return Err(format!("{} {}", TOKEN_START_ERR, TOKEN_FORMAT));
        }
        let token_info = match parts.len() {
            1 => {
                warn!("{}", TOKEN_MISSING_UID);
                TokenInfo::new(vec![], None, "")
            }
            2 => TokenInfo::new(vec![], create_uid(parts[1]), ""),
            3 => TokenInfo::new(vec![], create_uid(parts[1]), parts[2]),
            _ => {
                let v = parts.clone().split_off(3);
                TokenInfo::new(v, create_uid(parts[1]), parts[2])
            }
        };
        Ok(token_info)
    }
}

Using the impl keyword, we implement two functions for the TokenInfo struct. After defining them, we’re able to call them using TokenInfo::new(...) and TokenInfo::from_query_param(...). These work like static functions in Java. To define methods, we have to provide self as the first parameter (see the next snippet); we’re then able to call the methods on an instance instead of on the struct: my_token_info.encode(...).
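The difference between the two call styles can be sketched with a small struct of our own:

```rust
struct Counter {
    count: u32,
}

impl Counter {
    // Associated function (no self): called on the type, like a Java static.
    fn new() -> Counter {
        Counter { count: 0 }
    }

    // Method (takes &mut self): called on an instance.
    fn increment(&mut self) -> u32 {
        self.count += 1;
        self.count
    }
}

fn main() {
    let mut counter = Counter::new();    // type-level call
    counter.increment();                 // instance-level call
    println!("{}", counter.increment()); // prints "2"
}
```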

In the new function, we can also see how to use the map method to transform one collection type into another (a Vec<&str> into a Vec<String>).
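In isolation, that same transformation looks like this:

```rust
fn main() {
    let scopes: Vec<&str> = vec!["read", "write"];
    // iter() borrows the items, map converts each &str into an owned String,
    // and collect() builds the target collection named by the type annotation.
    let owned: Vec<String> = scopes.iter().map(|s| s.to_string()).collect();
    println!("{:?}", owned); // prints ["read", "write"]
}
```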

impl Encodable for TokenInfo {
    fn encode<S: Encoder>(&self, encoder: &mut S) -> Result<(), S::Error> {
        encoder.emit_struct("TokenInfo", 1, |encoder| {
            try!(encoder.emit_struct_field("scope", 0, |encoder| self.scopes.encode(encoder)));
            try!(encoder.emit_struct_field("realm", 1, |encoder| self.realm.encode(encoder)));
            if self.uid.is_some() {
                try!(encoder.emit_struct_field("uid", 2, |encoder| self.uid.encode(encoder)));
            }
            Ok(())
        })
    }
}

Here we are implementing the Encodable trait for our structure. The goal is to be able to transform our structure into JSON. We can also see how to define a method, using the &self as a first parameter:

#[cfg(test)]
mod tests {
    use super::TokenInfo;
    use super::{TOKEN_START_ERR, TOKEN_FORMAT};
    use rustc_serialize::json;

    #[test]
    fn token_info_new_test() {
        let token_info = &TokenInfo::new(vec!["read", "write"], None, "/employees");
        assert_eq!("{\"scope\":[\"read\",\"write\"],\"realm\":\"/employees\"}", json::encode(token_info).unwrap());
    }

    #[test]
    fn token_info_from_token_param_fail_test() {
        let token_info_err = TokenInfo::from_query_param("bla-/employees-read-write").err().unwrap();
        assert_eq!(format!("{} {}", TOKEN_START_ERR, TOKEN_FORMAT), token_info_err);
    }
}

The idiomatic way of writing unit tests in Rust is to define a tests submodule in the same file as the production code. Thanks to the #[cfg(test)] attribute, the compiler ensures that tests aren’t included in a release build. As tests is a submodule, we need to import the TokenInfo struct using use super::TokenInfo;.

Rust has a powerful macro system. All of the calls that end with ! are macro invocations. In our tests, we use the assert_eq! macro, which panics if its two arguments aren’t equal.
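Defining a macro of your own is done with macro_rules!. As a toy sketch (ours, not from the service), here is a macro that builds the same error body as invalid_request:

```rust
// A declarative macro: the pattern `($desc:expr)` captures one expression.
macro_rules! json_error {
    ($desc:expr) => {
        format!(
            "{{\"error\":\"invalid_request\",\"error_description\":\"{}\"}}",
            $desc
        )
    };
}

fn main() {
    let body = json_error!("token is missing");
    println!("{}", body);
    // prints {"error":"invalid_request","error_description":"token is missing"}
}
```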

I hope you found my mini-dive into Rust fascinating enough to give it a try yourself. In the second part of this post, I’ll go further into detail about how to put a Rust JSON API into a 5MB Docker Image.

You can contact me on Twitter @danpersa if you have any further questions. Thanks for reading!
