Data Engineer



At CubicFarms we believe in the power of technology to feed a changing world. We actively seek talented people who want to make a difference in the ag tech space by bringing creative ideas to life and leaving an imprint in a healthier, more sustainable world. If you are enthusiastic about joining a dynamic and innovative company on the leading edge of the ag-tech revolution, we invite you to apply!


CubicFarms (TSXV:CUB) is an ag-tech company providing automated growing systems for fresh produce and nutritious livestock feed. CubicFarms offers turnkey, commercial-scale, hydroponic, automated controlled-environment growing systems that can grow predictably and sustainably 12 months of the year, virtually anywhere on earth. The company has sold and installed systems in Canada and internationally and continues to grow a global pipeline of prospective customers.


We are seeking a Data Engineer to join our growing Technology team. Reporting to the Director of Data Science, you will be a pivotal member of the team, building data tools for our customers and operations teams. We take a data-driven approach to the customer experience by democratizing data: we deliver data to our people through intuitive tools so that our stakeholders and consumers can learn independently. We work closely with our stakeholders to gather the current business questions so that our data models stay relevant. Much of our data comes directly from our users; every action a user takes in our product is tracked and analyzed, with the goal of identifying areas that are not working well and determining how we can improve. Working with our team, you will be exposed to various facets of data science such as computer vision, machine learning, automation, predictive maintenance, and more.


Qualifications:

- 3-5 years of work experience with ETL, data modeling, and data architecture.

- Bachelor’s degree in Computer Science or related technical field, or equivalent work experience.



- Knowledge of data processing pipelines and data science concepts

- Experience processing large volumes of data.

- Hands-on experience with relational and NoSQL databases, distributed storage engines, and data warehousing.

- Hands-on experience in ETL design, optimization, and development using cloud tools.

- Experience with building data pipelines and applications to stream and process datasets.

- A strong programming background in data ops (Python, Shell, SQL).

- Experience working with streaming data is an asset.

- A strong desire to continue to grow your skillset.

- Experience in database architectures and data pipeline development.

- Familiarity with DevOps practices and experience working in a Scrum/Agile delivery environment.

- Knowledge of Azure services including Cosmos, TSI, Power BI, and IoT Hub is an asset.

- Strong, persuasive communication skills.


Responsibilities:

- Build ETL code and SQL queries.

- Work with very large data sets across multiple systems for the data pipeline.

- Work with multiple internal and external infrastructure/project teams to understand and drive the existing data models and data needs.

- Provide support for current data flows into data applications, such as our homegrown Microsoft BI application and others, as necessary.

- Refine and decompose requirements to the level of detail required to initiate development.

- Deliver ETL and related components as specified in the design, functional and non-functional requirements, within established quality standards.

- Design, prototype, develop, test, and document existing and new ETLs, and related components; make recommendations that result in a more cost-effective product delivery.

- Perform unit testing of ETLs and related components and document test results.

- Perform troubleshooting on ETLs and related components.

- Identify application bottlenecks and opportunities to optimize performance.

- Develop automation frameworks and tools for data management.

- Introduce stakeholders to data architecture and analytics best practices.

- Collaborate with PM, Platform and UX teams to deliver end-to-end data solutions and build data pipelines.

- Develop and maintain a clean data environment for analysts and data scientists to perform data modeling, mining, and machine learning.

- Prototype and iterate on designs and collaborate with other teams such as Growing and R&D.


CubicFarms is an equal opportunity employer committed to building a team that represents a variety of backgrounds, perspectives, and skills. We encourage applications from all qualified and enthusiastic candidates. CubicFarms is a growing organization in an emerging industry that will provide the right candidate with significant potential for advancement in the ag-tech space.
