Benefits

About Working Here


We are small, but we think big. And we want you to be a part of our team.

Certifications

Receive 100€ annually towards a personal development opportunity of your choice (including books or conferences)

Learning & Development

40+ working hours that you can use to pursue your own education path, plus additional workshops

Extra days off

We give you extra days off for your birthday and for Christmas.

External activities

We organize several extra and external team-building activities throughout the year (Spring, Summer, Autumn, Christmas).

Work Remotely

We provide all the tools you need, so you can work from anywhere with the customers who rely on our global support.

Above-average salary

We match your salary to your experience, and we want to make sure that you are happy with your terms.

Health Insurance

We take your health seriously, so everyone's package includes private health insurance.

Jobs

Open Positions


The following list shows our current open positions. Visit us regularly to find new or updated roles!



You will join the Data Engineering team, helping to maintain and improve our Big Data architecture and tools.


What you’ll do and key responsibilities
  • Design and build scalable & reliable data pipelines (ETLs) for our customers’ data platforms
  • Constantly evolve data models & schema design of Landing Zones, Operational Data Stores and denormalized models to support self-service needs (experience in DW and DQ projects is a plus)
  • Design and build outputs to be explored and used by business stakeholders (tables, views or reports using commercial tools like Tableau, Oracle BI, QlikView)
  • Work cross-functionally with various teams, creating solutions that deal with large volumes of data
  • Work with the team to set and maintain standards and development practices

Main requirements
  • You have 1+ years of experience in building and maintaining data pipelines in Hadoop clusters (Cloudera is a plus) using HDFS, Hive, Kafka and Spark
  • You have 2+ years in a Data Warehouse environment with varied forms of data infrastructure, including relational databases, Hadoop, and column stores
  • Good experience in creating and evolving dimensional data models & schema designs to improve accessibility of data and provide intuitive analytics
  • You are familiar with continuous delivery principles: version control, unit and automated tests
  • Skilled in one of the following languages and frameworks: SQL, Java, Python, Spark
  • Fluent in English, both written and spoken
  • Good analytical and problem-solving skills
  • Able to work in a fast-moving operational environment
  • Enthusiastic, with a positive attitude

Submit your application

We are looking for a data scientist who will help us discover the information hidden in vast amounts of data and help us make smarter decisions to deliver even better products. Your primary focus will be applying data mining techniques, performing statistical analysis, and building high-quality prediction systems integrated with our products.


What you’ll do and key responsibilities
  • Selecting features, building and optimizing classifiers using machine learning techniques
  • Data mining using state-of-the-art methods
  • Extending the company’s data with third-party sources of information when needed
  • Enhancing data collection procedures to include information that is relevant for building analytic systems
  • Processing, cleansing, and verifying the integrity of data used for analysis
  • Performing ad hoc analysis and presenting results in a clear manner
  • Developing both customer and internal AI/ML projects

Main requirements
  • Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, decision forests, etc.
  • Experience with data visualisation tools, such as D3.js, ggplot, etc.
  • Good applied statistics skills, such as distributions, statistical testing, regression, etc.
  • Data-oriented personality
  • You are familiar with continuous delivery principles: version control (e.g. Git), unit and automated tests
  • Skilled in one of the following languages and frameworks: SQL, Java, Python, Spark
  • Fluent in English, both written and spoken
  • Good analytical and problem-solving skills
  • Enthusiastic, with a positive attitude

Submit your application

You will join the Data Support team, helping to maintain our BI architecture and data pipelines (ETL).


What you’ll do and key responsibilities
  • Support and maintain data pipelines (ETLs) for our data platform
  • Constantly evolve data models & schema design of our Data Warehouse to support self-service needs
  • Monitor and report performance metrics
  • Work with the team to set and maintain standards and development practices

Main requirements
  • 2+ years of experience in building and maintaining data pipelines in a custom or commercial tool/application/framework
  • 2+ years of experience in SQL
  • Good experience in creating and evolving dimensional data models & schema designs to improve accessibility of data and provide intuitive analytics
  • You are familiar with continuous delivery principles: version control (e.g. Git), unit and automated tests
  • Skilled in one of the following programming languages: C#, Java, Python
  • Experienced in working with reporting tools (Excel, Tableau, QlikView, Power BI, Looker, ...)
  • Fluent in English, both written and spoken
  • Good analytical and problem-solving skills

Submit your application

You will join the Data Engineering team, helping to maintain and develop reports and dashboards, including user training and requirements definition.


What you’ll do and key responsibilities
  • Define requirements according to business needs
  • Constantly evolve existing reports and improve design standards and guidelines
  • Design and build outputs to be explored and used by business stakeholders (tables, views or reports using commercial tools like QlikView and Qlik Sense)
  • Work cross-functionally with various teams, including the business side and Data Engineering (DW)
  • Work with the team to set and maintain standards and development practices

Main requirements
  • You have 2+ years of experience in building and maintaining reports and dashboards using QlikView
  • You have 2+ years in a Data Warehouse environment with varied forms of data infrastructure, including relational databases (Hadoop and column stores are a plus)
  • Good experience in creating and evolving dimensional data models & schema designs to improve accessibility of data and provide intuitive analytics
  • You are familiar with continuous delivery principles: version control, unit and automated tests
  • Skilled in SQL
  • Fluent in English, both written and spoken
  • Good analytical and problem-solving skills

Submit your application

You will be focused on the development and support of the SharePoint infrastructure and SharePoint applications. The SharePoint Admin/Developer will act as the key SharePoint resource to the organization, owning the technical infrastructure, advocating for SharePoint’s collaboration features, training teams, and implementing SharePoint applications. They will perform Business Systems Analyst functions as required, mostly focused on SharePoint but always highlighting when other solutions should be considered. They will also train and support site owners.


What you’ll do and key responsibilities
  • Business analysis for SharePoint projects: work with clients in a structured fashion to identify key project functional requirements
  • Set up site pages, document libraries, and workflows as required
  • Work with clients to ensure that sites are tested fully and that there is a clear go-live plan
  • Develop SharePoint sites to meet business requirements, adhering to existing design standards
  • Work with the team to set and maintain standards and development practices

Main requirements
  • You have 4+ years of experience in SharePoint 2013 and SQL Server 2014 administration with PowerShell and T-SQL
  • Plan, create, and manage Web Apps, Site Collections, and sub-sites, including permissions, troubleshooting, and issue resolution
  • IIS management
  • SharePoint governance leadership and enforcement
  • Skilled in one of the following languages and frameworks: C#, ASP.NET, JavaScript
  • Fluent in English, both written and spoken
  • Good analytical and problem-solving skills

Submit your application

We want to provide a real work experience where you can take part in all the different stages of development. For this reason, we designed this internship around developing a solution from scratch, based on best practices in software engineering, IoT, Cloud and Big Data. We believe this will bring us a lot of value and provide a great learning opportunity for you. Above all, we hope that at the end of your internship you will have achieved something fantastic.


Starts on the 5th of June and ends on the 1st of September.


What we offer
  • A workplace in Viana do Castelo city center
  • 3 vacation days, to be scheduled
  • Real experience of the world of software engineering, with people who find creative solutions to complex problems on a daily basis
  • All the tools you need to work (laptop, cloud resources, learning access)

Main requirements
  • Bachelor’s or master’s degree students, or anyone with a background in computing
  • Share information and knowledge proactively
  • Be self-driven and work towards a common team or company purpose
  • Proactive attitude
  • Speak and understand Portuguese and/or English
  • Available full time for 12 weeks

Submit your application

We are always looking for new data engineers and data scientists. Submit your CV here and we will call you for a quick chat!


What you’ll do and key responsibilities
To be defined
Main requirements
To be defined

Submit your application