Overview
Appsbroker is Google’s largest Premier Partner in Europe. We deliver leading-edge cloud services to some of the world’s most exciting brands, in sectors from retail through automotive to finance.
This role will expose you to some of the most exciting cloud transformation and optimisation projects on Google Cloud Platform (GCP), working with some of the world's leading large-scale data tools, such as BigQuery and Bigtable.
You will be joining a dedicated Data and Analytics team that concentrates on using data to transform the way the world works. As a Cloud Data Engineer, you will work on leading-edge cloud data transformation and optimisation projects for global brands, helping to architect solutions to complex client problems.
Role Summary
As a Cloud Data Engineer, you will play a key role in the Data & Analytics team, designing cloud enterprise solution architecture and helping to build big data solutions for customers by integrating their structured, semi-structured and unstructured data on low-latency platforms. You will work to solve complex business problems using Big Data, AutoML, ML and visualisation technologies.
Training and certification on Google Cloud Platform (GCP) are provided as part of the extensive onboarding programme. The role offers you an exciting long-term career with us as a fast-growing market leader.
Responsibilities
In this role, you will have the opportunity to:
Work in a team to consult on, design and coordinate architecture that modernises infrastructure for performance, scalability, latency and reliability
Consult on, and apply in-depth knowledge of, database, data warehousing and big data technologies such as BigQuery, Oracle, Redshift, Teradata and Hadoop
Design data storage, movement and orchestration solutions
Create visualisations, dashboards and MIS reports
Identify, scope and participate in the design and delivery of cloud data platform solutions
Design and execute a platform modernisation approach for customers' data environments
Design, coordinate and execute pilots, prototypes or proofs of concept; validate specific scenarios and provide deployment guidance
Document and share technical best practices/insights with engineering colleagues and the architect community
Design data architecture solutions for structured data technologies, with an understanding of unstructured data technologies
Create and maintain appropriate standards and best practices around Google Cloud SQL, BigQuery and other data technologies
Define and communicate complex database concepts to technical and non-technical people
Core Skills
● Strong analytical and design skills around the full end-to-end ETL lifecycle of data pipelines in large enterprise settings
● History of working with data warehouse solutions (on-premises & cloud)
● Hands-on experience of enterprise data migration
● Good experience of SQL on relational & non-relational databases (Redshift, BigQuery, MySQL, etc.)
● Practical experience of data curation, cleansing, creation and maintenance of data sets for business purposes
● Experience in data science modelling, analytics & BI reporting (Tableau, Looker, Periscope, Data Studio)
● Knowledge and experience of common open-source languages for data integration (Python, Go, Java)
● Understanding of industry-standard containerisation approaches (Docker, Kubernetes, etc.) would be advantageous
● Experience in building scalable data pipelines using Hadoop/Spark clusters would be desirable
● Strong technical skills, combined with good analytical skills
● Experience of project delivery within Agile methodologies and planning processes
● Experience with System and User Acceptance Testing
● Must have good verbal communication skills to explain complex solutions to customers clearly and concisely, combined with the ability to communicate plainly with stakeholders at all levels within an organisation