We believe that teamwork is the key to unlocking extraordinary achievements.
At Whiteklay, we embrace continuous improvement as our foundation. Every day, we refine our skills and push for excellence in everything we do.
Empathy
We prioritize understanding and compassion, ensuring that our solutions resonate with the unique needs of our clients.
Life at Whiteklay
Collaboration is at the heart of what we do. From understanding your goals to delivering tailored solutions, we partner with you every step of the way.
What Do Your Future Co-Workers Say?
We blend cutting-edge technology with eco-friendly practices to build structures that stand the test of time. Our commitment to quality, efficiency, and sustainability ensures every project is built for the future.
Peter Parker
Michael Carter
Daniel Vaughn
Sophia Reynolds
Join Our Team
DevOps Engineer
Job Title:
SDE-I, DevOps Engineer, Staff Engineer
Responsibilities:
- Apply system patches and address security vulnerabilities to ensure platform integrity.
- Provision and manage Nutanix VMs.
- Support Kubernetes-based environments for containerized applications (experience preferred).
- Troubleshoot OS-level issues, perform root cause analysis, and resolve incidents.
- Collaborate with development teams to address system-related technical challenges.
- Automate repetitive tasks using scripting (e.g., Shell, Python) to improve efficiency.
- Monitor system performance, optimize configurations, and maintain high availability.
- Document processes, SLAs, and maintain knowledge bases for operational support.
- Participate in the on-call rota and provide flexibility during major outages.
Required Skills:
- 3+ years of experience in Linux/Red Hat OS support, open-source software, and technical operations.
- Strong knowledge of Linux/Unix systems, OS-level and package-level patch management, and vulnerability fixes.
- Familiarity with the Nutanix hypervisor and KVM, with knowledge of VM provisioning and networking.
- Knowledge of networking and Kubernetes is preferred.
- Proficiency in one or two scripting languages (Shell, Python) and automation tools.
- Experience with monitoring tools, incident management, and ITIL frameworks.
- Strong problem-solving skills and attention to detail.
Data Engineer
Job Description:
We are looking for a Data Engineer to build and maintain robust data pipelines and ensure data availability for analytical purposes.
Responsibilities:
- Design and maintain data pipelines using Spark, Hive, and Hadoop.
- Develop ETL processes with Talend and DBT.
- Write and optimize advanced SQL queries.
- Manage workflows with Airflow and CI/CD pipelines with Jenkins, GitHub Actions, or GitLab.
- Implement data governance practices.
- Utilize GCP and AWS for cloud-based solutions.
Required Skills:
- Proficiency in big data technologies (Spark, Hive, Hadoop, NiFi, Kafka).
- Experience with Python, Talend, SQL (Advanced), and databases.
- Familiarity with Airflow, DBT, and cloud platforms (GCP, AWS).
- Basic understanding of data governance.
- Exposure to CI/CD tools like Jenkins, GitHub Actions, or GitLab.
SE
Job Description:
We are a young team of highly motivated and experienced individuals with a fun, creative, and open culture. Commitment and integrity, consumer focus, innovation, and teamwork are the pillars on which the organization is built and what we care about most. We believe in the power of teamwork and want to build a world-class team that delivers its best to help us achieve our goals.
Core Skills:
- Java
- Spring Boot
- Gradle or Maven (either one)
- Git (very important)
- MySQL, SQL, or MongoDB (any one)
- Testing or Debugging or Unit Testing
Additional Skills:
- Any of the following skills is an added advantage:
- Microservices or Rest API
- Design patterns
- Event Driven Systems or Queue based systems
- Docker
SSE
Job Description:
We are a young team of highly motivated and experienced individuals with a fun, creative, and open culture. Commitment and integrity, consumer focus, innovation, and teamwork are the pillars on which the organization is built and what we care about most. We believe in the power of teamwork and want to build a world-class team that delivers its best to help us achieve our goals.
Core Skills:
- Candidate must have worked on a complete module, project, or component
- QBase systems (very important)
- API Gateway / Controllers
- Docker (important)
- Java
- Spring Boot
- Microservices or Rest API
- Gradle or Maven (either one)
- Git (very important)
- MySQL, SQL, or MongoDB (any one)
- Testing or Debugging or Unit Testing
- Design patterns
Additional Skills:
- Any of the following skills is an added advantage:
- GCP
- AWS
- Azure
- Event Driven Systems or Queue based systems
TL
Job Description:
We are a young team of highly motivated and experienced individuals with a fun, creative, and open culture. Commitment and integrity, consumer focus, innovation, and teamwork are the pillars on which the organization is built and what we care about most. We believe in the power of teamwork and want to build a world-class team that delivers its best to help us achieve our goals.
Core Skills:
- Candidate must have 1-2 years of experience managing a team
- JIRA and Confluence
- Agile and Scrum
- QBase systems (very important)
- API Gateway / Controllers
- Docker (important)
- Java
- Spring Boot
- Microservices or Rest API
- Gradle or Maven (either one)
- Git (very important)
- MySQL, SQL, or MongoDB (any one)
- Testing or Debugging or Unit Testing
- Design patterns
Additional Skills:
- Any of the following skills is an added advantage:
- GCP
- AWS
- Azure
- Event Driven Systems or Queue based systems
Associate Data Engineer
Job Description:
Join our team as an Associate Data Engineer to assist in building scalable data pipelines and maintaining data integrity.
Responsibilities:
- Support the development of data pipelines using big data technologies.
- Assist in creating ETL workflows using Talend and DBT.
- Write and test SQL queries for data analysis.
- Support workflow orchestration using Airflow.
- Help implement data governance best practices.
Required Skills:
- Basic knowledge of big data technologies (Spark, Hive, Hadoop, NiFi).
- Familiarity with Python, SQL, and databases.
- Exposure to DBT, Airflow, and cloud platforms (GCP, AWS).
- Eagerness to learn CI/CD tools and data governance principles.
Don’t See The Right Position For You?
Send us your resume. If a position opens that matches your skill set, we’ll notify you.
