Large Scale with Elixir

When working with large-scale systems, developers shouldn’t have to worry about low-level details such as memory management and inter-process communication. These details should just work, leaving developers free to focus on the high-level aspects of the overall system. Many traditional languages force developers to manage these details themselves; as a result, systems built with them don’t scale as well and aren’t as cost effective.

Elixir is a language intended to be easy to learn. Because it is built on Erlang and compiles to the same bytecode, it gains all the benefits of Erlang, and the two can be mixed freely: Elixir can call into Erlang code with no overhead, which gives developers access to all of the existing Erlang libraries.
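For example, here is a minimal sketch of Elixir calling directly into Erlang’s standard library (these are real OTP modules; the values are just for illustration):

```elixir
# Erlang modules are addressed as plain atoms from Elixir; there is no FFI layer.
:math.pow(2, 10)                    #=> 1024.0
:erlang.system_time(:millisecond)   #=> current VM time in milliseconds

# Erlang/OTP data structures work the same way, e.g. the :queue module:
q = :queue.new()
q = :queue.in("first job", q)
{{:value, job}, _rest} = :queue.out(q)   # job == "first job"
```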

Erlang was designed at Ericsson to handle the scale of telecom systems, long before the web existed. To do so, Erlang was built around concurrency, fault tolerance, reliability, hot code swapping, and scalability. Because of those early roots, it has become a mature, battle-tested language for large-scale systems.

The philosophy that drives Elixir/Erlang is “let it crash.” Human errors are inevitable, and eventually programs will crash. For a system to be truly reliable, it must be able to recover from these inevitable failures. Erlang handles this by breaking the system up into many lightweight processes watched over by supervisors. When a process dies abnormally, its supervisor simply restarts it and the system carries on as if nothing had happened. Consequently, Elixir/Erlang developers spend little time writing error-handling code and more time writing valuable business logic.
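As a rough sketch of what that looks like in practice (the module, messages, and values below are made up for illustration), a worker can be written with no defensive error handling at all and simply placed under a supervisor:

```elixir
defmodule Counter do
  use GenServer

  def start_link(_arg), do: GenServer.start_link(__MODULE__, 0, name: __MODULE__)
  def init(count), do: {:ok, count}

  def handle_call(:value, _from, count), do: {:reply, count, count}
  def handle_cast({:add, n}, count), do: {:noreply, count + n}
end

# One crashed worker doesn't take the system down; the supervisor restarts it.
{:ok, _sup} = Supervisor.start_link([Counter], strategy: :one_for_one)

GenServer.cast(Counter, {:add, :oops})   # bad input -> this process crashes
Process.sleep(100)                       # give the supervisor a moment to restart it
GenServer.call(Counter, :value)          #=> 0, a fresh worker answering as usual
```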

There are other concurrent languages, such as Scala and Go, and they borrow many of the concurrency ideas Erlang pioneered. However, Erlang has had far longer to mature and was built from the ground up to be scalable.

For more information on Elixir, check out the Elixir website and the presentation.


Visions of Invision

At Rain, we’ve been using Invision more and more to prototype our projects and to communicate design proposals and decisions with our internal team as well as with clients. Invision provides a very simple and straightforward interface that speeds up the prototyping process, allowing us to “fail early, fail often”. We recently held a training session to introduce Invision to the full agency staff. Feel free to check it out; it’s nothing fancy, but it gives a pretty good overview of most of Invision’s features: Visions of Invision

We understand that we are designing an interaction, an experience, not just a product. Our websites and mobile apps are useful only as a means to our end users’ ends. Our objective in prototyping is to better understand the interaction between user and product, and to see how well our intended designs help those users achieve their purposes. By prototyping early we vet our initial ideas and can move on to better ones more quickly. Vetting happens best with feedback from the actual intended audience, and Invision makes gathering that feedback easier.

One of the best features of Invision is that it lets you progress through increasing levels of fidelity within the same project. With Invision, we are able to get sketches into a clickable prototype to test and experience on a computer or mobile device almost immediately. Replacing screens is simple: just drag and drop, or set up a folder on your desktop that automatically syncs updated images. We can use the same project from sketch to high-fidelity design, gathering feedback and communicating ideas internally through Invision for the whole project lifecycle.

The Invision folks have been actively developing the product and, so far, every change they’ve made has been, in my opinion, a genuine improvement and not just bloat.

Invision has some limitations. You can’t fully convey the parts of an experience that involve moving pieces (like sliders or animations), but for the speed of creating a clickable prototype (including many gestures and transitions for mobile), Invision is awesome.

Full disclosure: This post was unsolicited by Invision and I am not receiving any compensation for writing it. I don’t have any connection with Invision other than that Rain subscribes to the service.


Gitflow and Semantic Versioning

Organizing a project is hard. You have all of these different personalities you have to bring together, all of these technologies you have to make sense of, and to top it all off, you need to make sure your client is getting your best code at any given point. There are numerous methods available to help with organizing your code, but with time and money on the line, it can be quite a daunting task.

Enter Gitflow.

Gitflow is a method of organizing your code based on a blog post by Vincent Driessen (http://nvie.com/posts/a-successful-git-branching-model/). It divides the branches of your git repository into several different roles. These roles help keep code neat and tidy, and help prevent overwrites and merge conflicts.

The branches that Vincent outlines are as follows:

- Master branch: This is the main release branch, and should contain the most polished code available. It is created from the next type of branch:

- Release Candidate branch: This branch contains code that is almost ready for release, but may need additional bugfixes before it is ready for a full release. It is based on the next type of branch:

- Develop branch: This is the current build of the application, and may contain less stable code. Builds from this branch are also known as snapshot builds or nightlies, since they are updated on a daily basis.

Master, Release Candidate, and Develop branches should never be pushed to directly. To add new code to these static branches, you should use the working branches that Vincent describes:

- Hotfix branches: These are branches off master containing very specific bugfixes for the production environment. They are created only in emergencies, and should not be the normal method of merging in new code. That honor is bestowed upon:

- Feature branches: These are branches created by individual developers on a per-feature basis. New, unstable code is written and debugged here, then merged into the develop branch via a pull request. Using separate working and static branches keeps the main code as clean as possible, free from the clutter the development process entails. (A sketch of this flow in plain git commands follows below.)
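To make the flow concrete, here is a rough sketch in plain git commands (the branch names and version numbers are made up for illustration; the HubFlow tool mentioned below wraps steps like these):

```
# Start a feature off develop, work on it, then pull request it back into develop
git checkout develop
git checkout -b feature/login-form
# ...commit work...
git push origin feature/login-form       # open a pull request targeting develop

# Cut a release candidate branch off develop for final bugfixes
git checkout -b release/1.3.0 develop

# When it is ready, merge it to master, tag the release, and merge it back into develop
git checkout master
git merge --no-ff release/1.3.0
git tag -a v1.3.0 -m "Release 1.3.0"
git checkout develop
git merge --no-ff release/1.3.0

# An emergency fix goes straight off master as a hotfix branch
git checkout -b hotfix/1.3.1 master
```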

A handy tool designed to assist with this type of git usage is available via HubFlow (https://github.com/datasift/gitflow). Feel free to take a look at it to simplify using Gitflow in your daily workflow.

In connection with Gitflow, Semantic Versioning allows any user to see where in the development process an application is. It uses an X.Y.Z-BuildNo format, where X is a major release with breaking API changes, Y is a minor release with solely non-breaking feature additions, and Z is bugfixes designed to correct issues in previous features. The -BuildNo at the end is used to denote a snapshot version or release candidate, so the user is aware that it may be more unstable than normal releases.
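As a quick, made-up illustration of how those rules play out for a single project:

```
1.4.2 -> 1.5.0       # new, non-breaking feature added: bump Y, reset Z
1.5.0 -> 1.5.1       # bugfix only: bump Z
1.5.1 -> 2.0.0-rc.1  # release candidate for a breaking API change: not yet stable
2.0.0-rc.1 -> 2.0.0  # the breaking release itself: bump X, reset Y and Z
```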

As Gitflow and Semantic Versioning are followed on individual projects, they will solve many issues in communicating the current state of the application, and ensure that all the developers involved are working nicely together.

For a more in-depth overview of this training, feel free to take a look at the source documentation at http://therebelrobot.github.io/HubFlow-Site/build/#



If you’re not learning, you’re sinking

We work in an industry that is always changing and growing. Once-powerful tools are rendered obsolete, and new technology springs up seemingly out of nowhere and spreads like a virus. Though I’ve only worked in the tech world for a short time, I’ve learned one immutable truth: if we do not challenge ourselves to change as quickly as the industry does, we ourselves become obsolete.

When I started out in QA I was only just scratching the surface. The terms API, git, and agile were all new to me. As I got into things I quickly realized that there’s always something new to learn and to understand more deeply. It was then that I challenged myself to start learning something new the moment I began to feel comfortable with how things were going.

Eventually I began to simulate test cases, query databases, use git to pull new code and even commit changes of my own.

Sometimes we take it for granted, but we stand shoulder to shoulder with exceptionally talented people. Many of them are happy to share their knowledge and experience with anyone who asks. Being able to pick the brains of those around you not only helps you overcome your current obstacles, but also gives valuable insight into the problems they’ve encountered and the lessons they’ve learned, so you don’t make the same mistakes.

The internet is also a valuable tool for learning. You can quite literally teach yourself anything you need to know. These TED Talks focus on the theme of learning and how we learn. If you have some time, I recommend looking at some of the stories people share. There is no one specific way to go about learning more: some stories stress the value of self-directed learning, some the value of collaboration. One in particular talks about setting a goal and trying something new for 30 days. It can be a simple goal or habit you want to form, such as getting back into the gym or studying something new. That doesn’t seem too hard, does it? It might at first, but eventually it will become routine.

It’s been a year since I began and I am light years ahead of where I started because of my desire to learn something, even if just a little bit, each day. I’m going to make it my New Year’s resolution to feel the same way about my progress this time next year. Sometimes it’s hard to take what little free time you have after a long day to learn something new, but committing to it, even if it’s just 30 minutes a day, will make you more valuable every day. Constantly putting yourself outside your comfort zone is how you grow the most.



Quick Tip For Creating a Cohesive Color Palette

Creating color palettes can be hard. Designers spend a lot of time trying to create cohesion between the various colors in a palette. A while back I learned a quick trick to help your color palette feel like a family, and it only takes a few extra steps:

Step 1: Go ahead and pick the colors you’ll be using. I chose chartreuse, green, pink & mint. Don’t worry too much about making them feel like a family just yet; just pick a few colors that you like.

Step 2: Choose another color to tie all the colors together. For example, if you want to give the palette a warm tone, choose a warm color like yellow-orange.

Step 3: Overlay the yellow-orange on top of your palette using a soft light blend mode. You will see the colors change immediately, and in most cases the shift will be too drastic at first. Now go ahead and play with the layer’s opacity until you get the tones you want.

Voila! As the example shows, you can overlay different colors onto your original palette to create cohesion in your colors.
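If you’re curious what that overlay step is doing to the underlying numbers, here is a rough sketch in Elixir of one common soft-light formula (the variant usually attributed to Photoshop), mixed back with the original color at a chosen opacity. The channel values and the 30% opacity are just placeholders:

```elixir
defmodule PaletteTint do
  # Soft-light blend of a single channel (values 0.0..1.0), using the formula
  # commonly attributed to Photoshop.
  def soft_light(base, blend) when blend < 0.5 do
    2 * base * blend + base * base * (1 - 2 * blend)
  end

  def soft_light(base, blend) do
    2 * base * (1 - blend) + :math.sqrt(base) * (2 * blend - 1)
  end

  # Mix the blended result back with the original channel at the given opacity;
  # this is the "play with the opacity" step above.
  def tint(base, blend, opacity) do
    base * (1 - opacity) + soft_light(base, blend) * opacity
  end
end

# One channel of a pink swatch overlaid with a warm yellow-orange at 30% opacity:
PaletteTint.tint(0.85, 0.65, 0.3)   #=> about 0.86
```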
