What I can do
- create and deliver products for business and customer needs;
- understand the context of the business, its domain model and have a big picture view;
- quickly learn, reading documentation for frameworks/APIs/third-party services/etc.;
- organize and present my thoughts in the form of convenient notes and technical documentation;
- weigh the pros and cons to make the most rational decision;
- construct a coherent picture of a service or of processes in general just from disparate pieces of information;
- defend my opinion in a discussion, supporting it with arguments rather than emotions;
- test my own and, if necessary, other people’s code.
What I know, what I did and what I worked with
- I know Python itself well, along with the ecosystem around it, including frameworks (Pylons, Django, Flask, aiohttp, FastAPI) and popular libraries;
- worked with asynchronous code;
- integrated C/Cython extensions for performance-critical computations;
- I understand ORMs and know their strengths and weaknesses (created dialect support for ClickHouse);
- migrated and refactored huge code bases;
- separated microservices from monoliths;
- used Scala as both «a better Java» and «a worse Haskell»;
- understand the concepts of functional programming and use them in practice;
- have experience with the cats ecosystem, a little less with ZIO;
- profiled and optimized GC in particular and JVM in general for different types of workloads;
- I’m familiar with actor-based systems (Akka, and to a lesser extent Lagom).
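As an illustration of the asynchronous work mentioned above, here is a minimal sketch using only the standard library's asyncio; the `fetch` coroutine is hypothetical, standing in for the aiohttp or database-driver calls real code would make:

```python
import asyncio

# Hypothetical I/O-bound task: in real code this would await an
# aiohttp request or an async database driver call.
async def fetch(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stand-in for network latency
    return f"{name}: done"

async def main() -> list[str]:
    # Run several "requests" concurrently instead of sequentially,
    # so total wall time is roughly max(delay), not sum(delay).
    return await asyncio.gather(
        fetch("users", 0.01),
        fetch("orders", 0.02),
    )

results = asyncio.run(main())
print(results)  # ['users: done', 'orders: done']
```

`asyncio.gather` preserves the order of its arguments, so the results line up with the calls regardless of which coroutine finishes first.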
DB and infrastructure
- worked with relational (MySQL, PostgreSQL, SQLite), non-relational (MongoDB, ES), and specialized (ClickHouse, DuckDB, TimescaleDB) databases;
- integrated search engines such as Manticore and ElasticSearch;
- familiar with key-value databases and the specifics of their application;
- have pretty good experience with Kubernetes: from bare-metal deployments (including Helm) to cloud environments such as GCP;
- know modern monitoring tools (Grafana/ELK/Graylog/Kibana), including integrating them into existing software;
- worked a lot with the Apache-based stack (Hadoop, HDFS, Hive, Spark, …) to build data warehouses;
- developed and maintained an in-house fork of Apache Spark;
- know how to create and debug ETL processes using AirFlow.
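Much of creating and debugging ETL comes down to making each task idempotent, so that a scheduler such as AirFlow can retry it safely. A minimal sketch in plain Python (the `load_partition` helper and its file-based sink are hypothetical, not taken from any project mentioned here):

```python
import json
import tempfile
from pathlib import Path

def load_partition(records: list[dict], out_dir: Path, ds: str) -> Path:
    """Idempotent load step: rerunning for the same date (ds)
    overwrites the partition instead of appending duplicates."""
    out = out_dir / f"ds={ds}.json"
    # Write to a temp file first, then atomically replace, so a
    # failed retry never leaves a half-written partition behind.
    with tempfile.NamedTemporaryFile("w", dir=out_dir, delete=False) as tmp:
        json.dump(records, tmp)
    Path(tmp.name).replace(out)
    return out

out_dir = Path(tempfile.mkdtemp())
p = load_partition([{"user": 1}], out_dir, "2020-01-01")
p = load_partition([{"user": 1}], out_dir, "2020-01-01")  # retry is safe
print(json.loads(p.read_text()))  # [{'user': 1}]
```

Keying the output on the schedule date (`ds`, mirroring AirFlow's templated date variable) is what makes the rerun a no-op rather than a duplicate load.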
- can understand modern frontend and JS (although I don’t really enjoy doing it);
- long work with self-hosted solutions has given me system administration skills that often come in handy;
- for my hobby projects I use some other languages: JS/TS, Java, Rust, PHP, a little bit of Go, C++/Qt, C#.
- pragmatist, passionate about what I do;
- I like to learn new things and apply the knowledge gained in practice;
- I have a good imagination and memory, and I can clearly explain my ideas to others.
What I want to do
- Develop and implement effective, ethical solutions that improve people’s lives;
- write efficient and clean code, not quick’n’dirty-ASAP-mess;
- work in a team with people smarter and more experienced than me.
What I do NOT like
- so-called «body shops»;
- inadequate management.
Supported a distributed backend built on an actor architecture (Akka and Akka-Persistence). Created a microservice for closed parking facilities.
Integrated several payment systems (a local provider and Stripe).
- Java, Scala
- Akka, Play, Tapir
- GCE, k8s
Creating the architecture for a DataLake, optimizing ETL processes.
- http4s, Play, ZIO, cats, KillBill
- Google Dataproc, Google Composer
2020 — fintech
Writing and supporting integrations with banks for KillBill and internal payment gateway.
- Scala, Java
- http4s, sttp, Play, ZIO, cats, Akka, a bit of Lagom
- KillBill, Google Cloud Engine
2017-2020 — data engineering
Building an internal user analytics processing architecture, developing tools, libraries (twilight, sparkle, cogwheel, glimmer) and documentation, running A/B tests.
Writing and maintaining tracking tools (Clerk), training and coaching for other teams.
Supporting the internal Apache Spark fork, deploying and partially supporting a Hadoop cluster. Unifying all ETL processes on a single platform (AirFlow) and tweaking it to the company’s needs.
- Scala, Java, a bit of Python/Rust
- http4s, cats, Spark API, sbt
- Vagga, Lithos, Apache Spark, YARN, HDFS, Kafka, AirFlow, Jupyter Notebook
2014-2017 — backend
Mentorship in EVO Summer Python Lab.
- Python, JS
- aiohttp, Pylons (fork), GraphQL, rx.js
- Docker, PostgreSQL, MySQL, Kafka, ElasticSearch, Redis, webpack
Designing and developing an order tracking system, optimizing the backend. Implementing CI/CD to deploy and test the project, refactoring the monolith.
Developing an architecture to collect user metrics, implementing it in Scala for better performance.
- Scala, Python
- Play, Flask, React.js
- Mercurial, Docker, Redis (Cluster), AWS
Integration of helpdesk with Yandex internal services.
- Python, JS
- Git, REST, Ansible
2011-2012 — National Aviation University
Support for lib.nau.edu.ua, administration of the library network segment, servers and related services. Backend and classifier optimization for library catalog, speeding up book searches.