
Functional Programming in Scala

14 Aug, 2018

In the previous article about Functional Programming in Python,
I explained that Functional Programming (FP) is a paradigm in which a program is composed of functions. Solving problems in
a functional way results in simple but powerful processing pipelines. We create pipelines using Functional Data Structures (FDS),
Higher Order Functions (HoF) and plain functions. A pipeline is a chain of functions that always returns a value.
This time we'll look at how to create processing pipelines using Scala, a Functional/Object Oriented
programming language created by Martin Odersky at EPFL,
a world-leading university in Lausanne, Switzerland.

Functional/Object Oriented

What does Functional/Object Oriented (FOO) mean? Both Functional Programming and Object Oriented (OO) programming are paradigms.
FP uses the function as its primary building block. Larger problems can be broken down into small functions that
can be composed to solve the larger problem. How to design applications using functions is described in the book
Functional Programming in Scala by Paul Chiusano and Runar Bjarnason.
With OO the primary building block is the object. An object has state and behavior. Larger problems can be broken down into
small objects. Objects communicate with each other by sending messages. How to design applications using objects
is described in the book Design Patterns: Elements of Reusable Object-Oriented Software by
Erich Gamma, Richard Helm, Ralph Johnson and John Vlissides. The authors of the book are often referred to as the Gang of Four, abbreviated to GoF.

Most OO programming languages support FP. This is because a function can be expressed as an object with an .apply() method that returns a value, as the sketch below illustrates.
To reduce the amount of ceremony and characters to type, it is preferable to use a programming language that supports function literals, like Java,
Python, Groovy, TypeScript, JavaScript, Go or Scala.
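
To make the point about .apply() concrete, here is a minimal sketch of my own (not from the original article): a function literal and an explicit object with an apply method behave the same way. The REPL echo of the definitions is omitted.

@ val f1 = (x: Int) => x + 1

@ val f2 = new Function1[Int, Int] { def apply(x: Int): Int = x + 1 }

@ f1(1) == f2(1)
res2: Boolean = true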
Scala supports both OO and FP because it provides Functional Data Structures: data structures with higher order behavior.
FDSs are also functions themselves, so they can be composed. Composing objects in a functional way allows
for new design patterns, e.g. processing pipelines. Let's take a look at how that works in Scala!

Functions and Scala

Scala uses the arrow syntax => to define a function. For example, when we type val f = (x: Int) => x + 1 we define
a function called f that, when applied to a value, returns that value plus one.

@ val f = (x: Int) => x + 1
f: Int => Int

@ f(1)
res1: Int = 2

// a shorter syntax
@ val f = (_: Int) + 1
f: Int => Int

@ f(1)
res2: Int = 2

Function Composition

Function composition is about combining functions: the combined function has the computational properties of both. For example, when we define two functions f and g, we can compose them into a function h that has the behavior of both.

@ val f = (_: Int) + 1
f: Int => Int

@ f(1)
res1: Int = 2

@ val g = (_: Int) + 2
g: Int => Int

@ val h = f compose g
h: Int => Int

@ h(1)
res2: Int = 4
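
Note that compose applies the right-hand function first: h(1) evaluates g(1) and then applies f to the result. Scala also provides andThen, which composes in the opposite order. A small sketch of my own to show the difference:

@ val double = (_: Int) * 2
double: Int => Int

@ val addOne = (_: Int) + 1
addOne: Int => Int

@ (double compose addOne)(3) // double(addOne(3))
res2: Int = 8

@ (double andThen addOne)(3) // addOne(double(3))
res3: Int = 7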

Pure and Impure Values

FP starts to shine when it operates on values that express more than a plain value. Most programs operate on pure values. Examples of pure values are the number 1, the text "hello", or a value like true or false. FP also allows us to express the result of a computation as a value. Such a value is impure because it expresses the effect of a computation. Examples of impure values are Success(123), Failure("First name is empty"), Some(1) or None.

@ import scala.util._
import scala.util._

@ Success(123)
res1: Success[Int] = Success(123)

@ Failure(new RuntimeException("First name is empty"))
res2: Failure[Nothing] = Failure(java.lang.RuntimeException: First name is empty)

@ Some(1)
res3: Some[Int] = Some(1)

@ None
res4: None.type = None

Functional Data Structures

Scala has a lot of FDSs built in, like Option and Try. Like functions, FDSs can be composed, which means we can create an FDS that has the behavior of both. Scala uses the name flatMap to compose two FDSs.

@ import scala.util._
import scala.util._

@ val y = Option(2)
y: Option[Int] = Some(2)

@ Option(1).map(_ + 1)
res1: Option[Int] = Some(2)

@ Option(2).flatMap(x => y.map(_ + x))
res2: Option[Int] = Some(4)

@ Option.empty[Int].flatMap(x => y.map(_ + x))
res3: Option[Int] = None

@ Try(1)
res4: Try[Int] = Success(1)

@ Try(1/0)
res5: Try[Int] = Failure(java.lang.ArithmeticException: / by zero)

The basic premise of FP is that if we can decompose our problem space into small, custom FDSs, then composing them gives us our full application. This pattern is described in the book Functional and Reactive Domain Modeling by Debasish Ghosh.
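
As a small, hypothetical sketch of that premise (my own example, not from the book): two tiny Option-returning steps composed with flatMap already form a pipeline that either produces a value or expresses that something went wrong.

@ import scala.util._
import scala.util._

@ def parseAge(s: String): Option[Int] = Try(s.toInt).toOption
defined function parseAge

@ def checkAdult(age: Int): Option[Int] = if (age >= 18) Some(age) else None
defined function checkAdult

@ parseAge("27").flatMap(checkAdult)
res3: Option[Int] = Some(27)

@ parseAge("abc").flatMap(checkAdult)
res4: Option[Int] = None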

Higher Order Functions

Before we can create pipelines, we need to look at higher order functions (HoF). HoFs are functions that receive a function as an argument or return a function as their result. It is a simple concept that we already know from function composition.

@ val f = (_: Int) + 1
f: Int => Int

@ val g = (_: Int) + 2
g: Int => Int

@ val h = (x: Int => Int, y: Int => Int) => x compose y
h: (Int => Int, Int => Int) => Int => Int

@ val i = h(f, g)
i: Int => Int

@ i(1)
res1: Int = 4
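
The example above shows a HoF that receives functions as arguments. A HoF can also return a function; a short sketch of my own using a curried function:

@ val add = (x: Int) => (y: Int) => x + y
add: Int => Int => Int

@ val addTen = add(10)
addTen: Int => Int

@ addTen(5)
res2: Int = 15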

List Operations

A List is also an FDS. It expresses the effect of having zero or more elements. The List supports operations that accept a function as an argument, but is in itself also a function.

@ List(1, 2, 3).map(_ + 1)
res0: List[Int] = List(2, 3, 4)

@ List(1, 2, 3).filter(_ > 1)
res1: List[Int] = List(2, 3)

@ List(1, 2, 3).flatMap(x => List(x + 1, x + 2, x + 3))
res2: List[Int] = List(2, 3, 4, 3, 4, 5, 4, 5, 6)

@ List.empty[Int].map(_ + 1)
res3: List[Int] = List()

@ List(1, 2, 3).fold(0)(_ + _)
res4: Int = 6

@ List(1, 2, 3).sum
res5: Int = 6

@ List("the", "book", "is", "green").mkString(",")
res6: String = "the,book,is,green"

@ List(1, 2, 3).headOption
res7: Option[Int] = Some(1)

@ List(1, 2, 3).partition(_ > 1)
res8: (List[Int], List[Int]) = (List(2, 3), List(1))

// a list is also a function as it can be applied
@ List(1, 2, 3).apply(0)
res9: Int = 1

The List FDS always returns a value. Depending on the pipeline we create, the List will return either a pure or an impure value. To create a pure value, use an operation like fold or mkString. To return an impure value, use an operation like map, flatMap or headOption.
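
For example, a small pipeline of my own that chains a few of these operations and ends in a pure value:

@ List(1, 2, 3, 4).filter(_ % 2 == 0).map(_ * 10).fold(0)(_ + _)
res0: Int = 60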

A List of Option Values

When we have a list of impure values, we can create a pipeline to combine the impure values. The result is also an impure value.

// notice the list also contains an empty Option; flatten removes it
@ val xs = List(Option(1), Option(2), Option.empty[Int], Option(3)).flatten
xs: List[Int] = List(1, 2, 3)

@ xs.sum
res2: Int = 6

For example, a List of Option values, written as List[Option[Int]], expresses a list of optional values. This list could be the result
of validating user input. The third user input is empty, but we want to continue processing, so we use the flatten operation
to remove the effect of the optional values. We now have a list of pure values that we can operate on.

A List of Effects

Sometimes we want to know the overall effect of a list of impure values. The operation we use is called sequence, but
this operation is not provided by the Scala standard library. We can add it to Scala by importing a library
called Scalaz.

@ import $ivy.`org.scalaz::scalaz-core:7.2.7`, scalaz._, Scalaz._

@ val xs = List(Option(1), Option(2), Option.empty[Int], Option(3))
xs: List[Option[Int]] = List(Some(1), Some(2), None, Some(3))

@ xs.sequence
res1: Option[List[Int]] = None

@ val xs = List(Option(1), Option(2),Option(3))
xs: List[Option[Int]] = List(Some(1), Some(2), Some(3))

@ val ys = xs.sequence
ys: Option[List[Int]] = Some(List(1, 2, 3))

@ ys.fold(0)(_.sum)
res6: Int = 6

When we have a list of optional values, we want to know if there is a missing value, and if so, we don't want
to process any further. The sequence operation on the List gives us the value None if there is a missing value.
When all values are present, it returns the value Some(List(1, 2, 3)). With the fold operation, I choose to return
zero when a value is missing, or the sum of all values when all values are present.

Validating Values

The Validation FDS is great for validating user input. Combined with Option, we can express missing values, and
combined with a Regex we can validate the format of user input. The Scala standard library does not provide a Validation data structure.
We can add Validation to Scala by importing a library called Scalaz.

@ import $ivy.`org.scalaz::scalaz-core:7.2.7`, scalaz._, Scalaz._

@ import scala.util.matching._
import scala.util.matching._

@ Option.empty[Int].toSuccessNel("No value")
res1: ValidationNel[String, Int] = Failure(NonEmpty[No value])

@ Option(1).toSuccessNel("No value")
res2: ValidationNel[String, Int] = Success(1)

@ Validation.lift(-20)(_ <= 0, "Number should be positive")
res3: Validation[String, Int] = Failure("Number should be positive")

@ Validation.lift("a")(("\d+".r).findFirstIn(_).isEmpty, "Input must be a number")
res4: Validation[String, String] = Failure("Input must be a number")

@ Validation.lift("1")(("\d+".r).findFirstIn(_).isEmpty, "Input must be a number")
res5: Validation[String, String] = Success("1")

A List of Validation Values

When we have a list of validation results, we can combine the effects and create a processing pipeline that will either aggregate
all the errors or return a list of values. The operation that we will be using is called sequenceU and is
part of Scalaz.

@ import $ivy.`org.scalaz::scalaz-core:7.2.7`, scalaz._, Scalaz._

@ val fn = Option.empty[String].toSuccessNel("Firstname is empty")
fn: ValidationNel[String, String] = Failure(NonEmpty[Firstname is empty])

@ val ln = Option.empty[String].toSuccessNel("Lastname is empty")
ln: ValidationNel[String, String] = Failure(NonEmpty[Lastname is empty])

@ val age = Option("27").toSuccessNel("Age is empty")
age: ValidationNel[String, String] = Success("27")

@ val zipcode = Option.empty[String].toSuccessNel("Invalid zipcode")
zipcode: ValidationNel[String, String] = Failure(NonEmpty[Invalid zipcode])

@ val validated = List(fn, ln, age, zipcode).sequenceU
validated: Validation[NonEmptyList[String], List[String]] = Failure(NonEmpty[Firstname is empty,Lastname is empty,Invalid zipcode])

@ validated.fold(_.toList, identity).mkString(",")
res1: String = "Firstname is empty,Lastname is empty,Invalid zipcode"

Because we have errors, the sequenceU operation collects all the errors and returns them as a list. When all values are successes,
we can process them as follows:

@ import $ivy.`org.scalaz::scalaz-core:7.2.7`, scalaz._, Scalaz._

@ val fn = Option("Dennis").toSuccessNel("Firstname is empty")
fn: ValidationNel[String, String] = Success("Dennis")

@ val ln = Option("Vriend").toSuccessNel("Lastname is empty")
ln: ValidationNel[String, String] = Success("Vriend")

@ val age  = Option("43").toSuccessNel("Age is empty")
age: ValidationNel[String, String] = Success("43")

@ val validated = List(fn, ln, age).sequenceU
validated: Validation[NonEmptyList[String], List[String]] = Success(List("Dennis", "Vriend", "43"))

@ val result = validated.fold(_.toList, identity)
result: List[String] = List("Dennis", "Vriend", "43")

Processing AWS Java SDK Errors in a Functional Style

The Validation FDS can capture exceptions that are raised by, for example, the AWS Java SDK, using the fromTryCatchNonFatal HoF. Java raises exceptions when operations fail;
deleting an S3 bucket, for instance, can fail. Validation captures the error as a Failure value, and the List FDS aggregates all the errors.

@ import $ivy.`org.scalaz::scalaz-core:7.2.7`, scalaz._, Scalaz._

@ import $ivy.`com.amazonaws:aws-java-sdk:1.11.362`, com.amazonaws.services.s3._, com.amazonaws.services.s3.model._, com.amazonaws.event._

@ val client = AmazonS3ClientBuilder.defaultClient()
client: AmazonS3

@ val results = List("foo", "bar", "baz").map(bucketName => Validation.fromTryCatchNonFatal(client.deleteBucket(bucketName)).leftMap(_.getMessage.wrapNel))
results: List[Validation[NonEmptyList[String], Unit]] = List(
  Failure(
    NonEmpty[The bucket is in this region: us-east-1. Please use this region to retry the request (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: 41D03FA4E55C4D11; S3 Extended Request ID: aW+TQWmyxLWoa+zUPMU1ml7iTvoYpITKxW9tywlgvgJQS99lwvtdxH9Q8KRxQFFVkBXSaV63vcw=)]
  ),
  Failure(
    NonEmpty[The bucket is in this region: us-east-1. Please use this region to retry the request (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: F8D9EE5008E28EC0; S3 Extended Request ID: uYcSr9e/S55eOsX5PyYE9HEBg/nRA1WHVmJdo72IXr4hZsYZC4zcMq8Y/kYbEuCBV1YELoLy0fE=)]
  ),
  Failure(
    NonEmpty[The bucket is in this region: us-east-1. Please use this region to retry the request (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: 513BA210C54FCAE9; S3 Extended Request ID: 3GAvErrQY+kzLFl43k+wgra16LvG4BPdjCKJW/YdlwR//bsucdgYXGsTxtzLETa9iKtDl+YjfCM=)]
  )
)

@ results.sequenceU.fold(_.toList, _ => List.empty[String]).mkString(",")
res12: String = "The bucket is in this region: us-east-1. Please use this region to retry the request (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: 41D03FA4E55C4D11; S3 Extended Request ID: aW+TQWmyxLWoa+zUPMU1ml7iTvoYpITKxW9tywlgvgJQS99lwvtdxH9Q8KRxQFFVkBXSaV63vcw=),The bucket is in this region: us-east-1. Please use this region to retry the request (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: F8D9EE5008E28EC0; S3 Extended Request ID: uYcSr9e/S55eOsX5PyYE9HEBg/nRA1WHVmJdo72IXr4hZsYZC4zcMq8Y/kYbEuCBV1YELoLy0fE=),The bucket is in this region: us-east-1. Please use this region to retry the request (Service: Amazon S3; Status Code: 301; Error Code: PermanentRedirect; Request ID: 513BA210C54FCAE9; S3 Extended Request ID: 3GAvErrQY+kzLFl43k+wgra16LvG4BPdjCKJW/YdlwR//bsucdgYXGsTxtzLETa9iKtDl+YjfCM=)"

Conclusion

Creating processing pipelines is very easy in Scala. Scala supports both FP and OO and comes with standard FDSs. Some advanced
FDSs must be imported from libraries like Scalaz. With a few lines of code we can express powerful processing pipelines
that are easy to reason about. I use this style of problem solving a lot. FP saves me time, reduces complexity, and results in modular code.
I can always test my code and reuse functions in other settings.
