Posted 5 hours ago
Job ID: JOB_ID_5813
Job Description: Java Backend Engineer with experience in Big Data and Spark.
- This role requires a strong background in Java backend development, coupled with significant experience in Big Data technologies and Apache Spark.
- The ideal candidate will be responsible for developing and maintaining robust backend systems that handle large volumes of data.
- Responsibilities include designing, implementing, and optimizing data processing pipelines.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the technical feasibility of UI/UX designs.
- Optimize application for maximum speed and scalability.
- Ensure all user input is validated before it is submitted to back-end services.
- Collaborate with server-side developers and the design team to understand and implement user requirements.
- Develop and manage well-functioning databases and applications.
- Write effective, scalable code.
- Test and deploy applications and systems.
- Revise, update, repair, and enhance existing systems.
- Work with data scientists and analysts to improve data models and data pipelines.
- Troubleshoot and debug applications.
- Perform backend service development and integration.
- Contribute to the entire application lifecycle, focusing on coding and debugging.
- Stay up-to-date with new technology trends and opportunities for application improvement.
- Participate in code reviews to maintain code quality and share knowledge.
- Work with CI/CD pipelines for automated testing and deployment.
- Ensure security and data protection best practices are followed.
- Analyze and improve the efficiency and performance of the application.
- Develop and maintain technical documentation.
- Provide technical guidance and mentorship to junior developers.
- Engage in problem-solving and root cause analysis for production issues.
- Contribute to architectural discussions and decisions.
- Ensure compliance with industry standards and regulations.
- Build reusable code and libraries for future use.
- Implement security and data protection measures.
- Design and develop APIs.
- Work with cloud platforms (e.g., AWS, Azure, GCP) for deployment and scaling.
- Experience with distributed systems and microservices architecture is a plus.
- Familiarity with containerization technologies like Docker and Kubernetes is beneficial.
- Understanding of data warehousing concepts and ETL processes.
- Proficiency in SQL and NoSQL databases.
- Experience with messaging queues like Kafka or RabbitMQ.
- Knowledge of monitoring and logging tools.
- Ability to work independently and as part of a team.
- Excellent communication and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Commitment to delivering high-quality software.
Special Requirements
A minimum of 15 years of experience is required. Please state your visa type in your email. An updated resume is required.
Compensation & Location
Salary: $50 – $50 per year
Location: Phoenix, AZ
Recruiter / Company – Contact Information
Email: ry@itecsus.com
Recruiter Notice:
To remove this job posting, send an email from ry@itecsus.com with the subject line DELETE_JOB_ID_5813.