Mykola is a Java backend developer (he calls himself full-stack) with a keen interest in CI/CD, testing, and everything that helps teams move faster without breaking too many things. Since 2015 he has worked at bol.com, one of the largest online retailers in the Netherlands.
Do your automated tests serve you well? Or do they seem to deliberately get in the way, slowing down every change? Has preparing the “given” become an arduous journey littered with shortcuts? An ever-increasing delivery pace raises the bar for test-automation quality. Many teams are good at producing decent test coverage, but often at the price of test maintainability, which can quickly become a bottleneck. Advanced testing frameworks like Cucumber, Spock, etc. can remedy some of these pains, but add their own complexity, overhead, and compromises. With the help of Kotlin language features such as named function parameters, default parameter values, and especially DSLs (Type-Safe Builders), the expressiveness of tests can be drastically improved. Add a modern assertion framework like AssertJ, and even plain JUnit becomes a very versatile testing tool for both unit and functional testing, with relatively low overhead and a low entry barrier. This demo will show how to use Kotlin’s features to fight the rising complexity of automated test data preparation. The material is based on real-world experience of using Kotlin to test backend microservices (Kotlin or vanilla Java) at bol.com - one of the largest webshops in the Netherlands.
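To give a flavour of the approach, here is a minimal sketch of the kind of test-data DSL the abstract refers to. The `Order` domain class, its fields, and their defaults are hypothetical, invented purely for illustration; the technique shown - a type-safe builder with default values - lets a test state only the fields it cares about.

```kotlin
// Hypothetical domain class, for illustration only.
data class Order(val id: String, val customer: String, val items: List<String>, val priority: Int)

// Builder with sensible defaults: a test overrides only what matters to it.
class OrderBuilder {
    var id: String = "order-1"
    var customer: String = "test-customer"
    var priority: Int = 0
    private val items = mutableListOf<String>()

    fun item(name: String) { items += name }
    fun build() = Order(id, customer, items.toList(), priority)
}

// Type-safe builder entry point: a lambda with receiver.
fun order(init: OrderBuilder.() -> Unit = {}): Order =
    OrderBuilder().apply(init).build()

fun main() {
    // Only the fields relevant to this scenario are mentioned;
    // everything else falls back to the builder's defaults.
    val urgent = order {
        priority = 5
        item("book")
    }
    println(urgent.priority)   // 5
    println(urgent.customer)   // test-customer
}
```

The same shape scales to nested builders (e.g. an `item { }` block of its own), which is where the DSL style starts to pay off over plain factory methods.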
Scheduled on Saturday from 17:40 to 18:30 in Room 4
Thorough testing before merging to master is great, but it doesn’t reveal the unknowns. Staging on shared environments tends to be slow, unreliable, and costly to support. Why not just learn from the only true environment by conducting safe and efficient experiments? This talk is based on my experience of increasing the delivery rate for the back-end systems of bol.com (one of the largest online retailers in the Netherlands; logistics and purchasing domains), where correctness is often a bigger concern than performance, and recovery might require a bit more than users hitting the refresh button in their browser. Testing in production is often associated with A/B testing or canary releases, but those aren't always the best - or even applicable - techniques. We’ll look instead at shadow and dry runs, controlled experiments, and survival of the fittest: how to apply these techniques and what to be aware of.
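As a rough illustration of one of the techniques named above, a shadow run can be sketched like this. Everything here is hypothetical (the `legacyPrice`/`newPrice` functions are invented stand-ins): the legacy result is what the caller actually gets, while the new implementation runs alongside and any divergence is only reported, never allowed to break production.

```kotlin
// Hypothetical old and new implementations, stand-ins for real services.
fun legacyPrice(sku: String): Int = sku.length * 100
fun newPrice(sku: String): Int = sku.length * 100

// Shadow run: serve the legacy answer, exercise the new code on real
// traffic, and surface mismatches without affecting the response.
fun price(sku: String): Int {
    val actual = legacyPrice(sku)            // the answer users receive
    runCatching { newPrice(sku) }            // shadow path must never throw outward
        .onSuccess { shadow ->
            if (shadow != actual)
                println("shadow mismatch for $sku: legacy=$actual, new=$shadow")
        }
        .onFailure { e -> println("shadow run failed for $sku: $e") }
    return actual
}

fun main() {
    println(price("abc"))   // 300
}
```

In a real system the mismatch report would go to metrics or structured logs rather than stdout, so the comparison data can drive the decision to promote the new code path.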
Scheduled on Saturday from 14:25 to 14:50 in Room 5