



with modern big data technologies

Our experts have designed, implemented and maintained dozens of Big Data solutions on clusters ranging from small (a couple of nodes) to huge (>10 PB). Data integration, storage concepts, business logic, monitoring, development environments, testing, and the delivery processes around them: these are the things we can help you with.


We work with Kafka, Hadoop, Spark, Flink, Jupyter, Scala, Python and much more. We can take care of your cluster as a whole and implement your product owner's requirements, or we can help your engineering teams with challenging tasks by bringing our experts on board.
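Whatever the engine, the pipelines we build tend to share the same map/filter/aggregate shape. A minimal, engine-agnostic sketch in plain Python (the record fields and function name are illustrative, not taken from a real project):

```python
# Illustrative batch-pipeline step: parse raw events, drop invalid
# records, and aggregate per key. The same shape maps directly onto
# Spark or Flink operators in a production job.

from collections import defaultdict

def aggregate_clicks(raw_events):
    """Sum click counts per user, skipping malformed records."""
    totals = defaultdict(int)
    for event in raw_events:
        user = event.get("user")
        clicks = event.get("clicks")
        if user is None or not isinstance(clicks, int):
            continue  # tolerate bad input instead of failing the whole job
        totals[user] += clicks
    return dict(totals)

events = [
    {"user": "alice", "clicks": 3},
    {"user": "bob", "clicks": 1},
    {"user": "alice", "clicks": 2},
    {"clicks": 5},  # malformed: no user field
]
print(aggregate_clicks(events))  # {'alice': 5, 'bob': 1}
```

In a real deployment the loop body becomes a `map`/`filter` chain and the per-key sum a `reduceByKey` or keyed window, but the business logic stays this small and testable.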


A project we are involved in typically implements integration interfaces and business logic pipelines, and delivers regular reports or forwards data on a regular basis for further processing. Our engagement covers:

  • Design and technology stack proposal

  • Development environment maintenance - code maintenance, continuous integration & delivery, integration with PM tools

  • Quality assurance - with a focus on automation and test-driven development

  • Monitoring - integration with your company's monitoring systems, support

  • Delivery - integration with your company's business processes
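To illustrate the test-driven style from the quality assurance point above: a pipeline transformation is written against its tests first. A minimal sketch, assuming a hypothetical currency-normalization step with a fixed rate table (both invented for the example):

```python
# Hypothetical currency-normalization step, developed test-first:
# the assertions at the bottom are written before the implementation.

RATES_TO_EUR = {"EUR": 1.0, "USD": 0.9, "PLN": 0.23}  # assumed static rates

def normalize_amount(amount, currency):
    """Convert an amount to EUR using the static rate table."""
    try:
        return round(amount * RATES_TO_EUR[currency], 2)
    except KeyError:
        raise ValueError(f"unknown currency: {currency}")

# Tests first, implementation second:
assert normalize_amount(100, "EUR") == 100.0
assert normalize_amount(100, "USD") == 90.0
assert normalize_amount(100, "PLN") == 23.0
```

In a real project these assertions live in a pytest suite wired into the continuous-integration step above, so every pipeline change runs against them automatically.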


Our solutions run on bare-metal or cloud (AWS) clusters.

We cover all the engineering tasks.

