Job details

INACTIVE LISTING: The employer is not currently recruiting actively for this position, but you can still apply to the listing.

Senior Software Engineer (Java & AWS Cloud)

Career level
Senior (5-10 years)
Job type
Full time
Languages spoken
English - Advanced
* all languages are required
Address
Available positions
1

Who We Are – MassMutual Romania

MassMutual Romania – in partnership with MassMutual in the United States – will help shape a culture of innovation and create the digital products and technology solutions that help people secure their future and protect the ones they love.

Positioning MassMutual for its next 20 million customers and remaining innovative in a digital-first world led to the creation of MassMutual Romania in 2020. With offices in Bucharest and Cluj, MassMutual Romania was established to build an in-house team with expertise in application development and support, quality assurance and data science.

For 170 years, MassMutual has put its customers at the heart of what it does by providing holistic financial solutions, guidance, and education on their terms. Its long-term strategy helps ensure that policyowners and their loved ones can rely on the company to be there when they need it most.

If this sounds like a fit, we’re looking to hire a Senior Software Engineer to join our strategic integration technology team.

Job Description

The Senior Software Engineer will work from our Cluj office while collaborating with global MassMutual teams and providing support in the cloud operations area. This role reports to a local manager based in Cluj.

MassMutual’s Strategic Integration organization is looking for an independent, self-starting Senior Software Engineer to join a cross-cultural team with a start-up mentality. In this role you will contribute to the solution design and development of message-streaming ETL transformations. You will refine your data analysis skills in order to build innovative integration software.

Responsibilities

  • Define and implement highly available, fault-tolerant software solutions as containerized applications (J2EE, Kubernetes, EMR, Spark, columnar data stores such as Parquet and JSON), capitalizing on an event-messaging model (Kafka) in a cloud platform;
  • Define and influence innovative solutions that meet not only functional requirements but also performance, scalability, reliability, and security requirements;
  • Define and build an event-based streaming data transformation platform that supports large-scale data migration (see the sketch after this list);
  • Collaborate with engineering teams to identify and resolve pain points as well as evangelize best practices;
  • Actively review code, mentor, and provide peer feedback;
  • Partner with various teams to transform concepts into requirements and requirements into services and tools;
  • Guide implementation best practices and assist other teams with implementation details in a variety of programming languages;
  • Engineer efficient, adaptable and scalable architecture in support of a variety of data applications;
  • Solve live performance and stability issues and prevent their recurrence;
  • Deploy, automate, maintain, and manage AWS cloud-based systems to ensure high availability, performance, scalability, and security;
  • Execute performance testing and tuning using monitoring and troubleshooting tools;
  • Work with various teams including developers, QA, technical support, project managers, and database administrators;
  • Assist in implementing security best practices and initiatives at all levels of the infrastructure.
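For illustration only, the sketch below shows the kind of consume-transform-produce loop that an event-based streaming transformation platform of this sort is typically built around. It is not MassMutual code: the broker address, topic names, consumer group, and the trivial upper-casing transform are placeholder assumptions, and a real platform would add error handling, domain-type serialization, and schema management.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    /**
     * Illustrative consume-transform-produce loop. Topic names, group id and
     * the transform step are placeholders, not the actual platform design.
     */
    public class StreamingEtlSketch {

        public static void main(String[] args) {
            Properties consumerProps = new Properties();
            consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "etl-sketch");
            consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            Properties producerProps = new Properties();
            producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {

                consumer.subscribe(List.of("source-events"));   // hypothetical input topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // Transform step: here just normalising the payload to upper case.
                        String transformed = record.value().toUpperCase();
                        producer.send(new ProducerRecord<>("transformed-events", record.key(), transformed));
                    }
                    consumer.commitSync();  // at-least-once delivery for the sketch
                }
            }
        }
    }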

Requirements

  • BS/BA required;
  • 6+ years of experience implementing large-scale enterprise solutions in J2EE and data integration;
  • Experience with Java 8+, Spring Boot, and Spring Data/Integration;
  • Experience building Java RESTful web services (a minimal sketch follows this list);
  • Experience with messaging frameworks, business process modeling, and relational/non-relational databases;
  • Experience with messaging frameworks implementing event messaging services, JMS, and message streaming in Kafka;
  • Experience across the CI/CD pipeline (GitHub/GitLab, Jenkins, Terraform, CloudFormation, Artifactory/Nexus, Docker, Kubernetes, AWS);
  • Experience with Aurora, PostgreSQL, AWS Athena, and columnar data stores (Parquet, JSON); other cloud and big-data data stores are a plus;
  • Independent self-starter with extensive experience and demonstrated success in an agile/adaptable environment;
  • Able to navigate existing application architectures and cognizant of the challenges and standard methodologies for supporting distributed systems;
  • Comfortable making decisions with limited data;
  • Excellent written and verbal communication skills.
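As a rough illustration of the Java RESTful web services mentioned above, the sketch below stands up a minimal Spring Boot application with a read-only endpoint. The /policies path, the Policy record, and the hard-coded responses are hypothetical placeholders, not the team's actual API, and the example assumes a recent JDK plus spring-boot-starter-web on the classpath.

    import java.util.List;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    /**
     * Minimal Spring Boot REST service. The /policies endpoint and Policy
     * record are illustrative placeholders, not a real domain model.
     */
    @SpringBootApplication
    public class RestServiceSketch {

        public static void main(String[] args) {
            SpringApplication.run(RestServiceSketch.class, args);
        }

        public record Policy(String id, String holder) {}

        @RestController
        static class PolicyController {

            // In a real service this would delegate to a repository or integration layer.
            @GetMapping("/policies")
            List<Policy> all() {
                return List.of(new Policy("p-1", "example-holder"));
            }

            @GetMapping("/policies/{id}")
            Policy byId(@PathVariable String id) {
                return new Policy(id, "example-holder");
            }
        }
    }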