IBM
Software Engineer
Salary
$190k - $217k/year
Job type
Full-time
Location
Armonk, New York, US
Remote
No
Posted
1 week ago
Job description
Your Role And Responsibilities
Software Engineer, IBM Corporation, Armonk, New York, and various unanticipated client sites throughout the US (up to 40% telecommuting permitted):
- Develop and enhance ETL (extract, transform, load) processes on Informatica PowerCenter from various sources to Oracle data warehouse.
- Lead a team of Data Engineers on assigned data warehouse ETL projects.
- Design and implement automated processes for regulatory reporting and calculations.
- Maintain documentation, runbooks, and incident records in compliance with audit requirements.
- Support applications, data pipelines and infrastructure for regulatory reports.
- Plan and conduct Informatica ETL unit and development tests; monitor business ETL processing and troubleshoot identified issues.
- Create Unix and Python scripts for data ingestion, validation and process auditing.
- Monitor existing data flows developed on Apache Flink framework and work on enhancements.
- Build data pipelines to extract data from various sources, perform transformations, and load it into target systems.
- Develop and schedule data pipelines using Airflow DAGs.
- Migrate business processes from Informatica to big data platforms/technologies while performing testing and quality assurance.
- Develop PySpark applications to process and analyze large datasets efficiently including implementing complex data transformations, aggregations, and statistical operations.
- Maintain code and versioning in GitHub.
- Write Oracle SQL and Hive queries to validate the data related to multiple financial reports.
- Support data warehouse month-end loads and monitoring to ensure successful completion.
- Utilize: Oracle SQL/PLSQL, Unix shell scripting, Java, data analytics and integration, Informatica PowerCenter (extract, transform, load (ETL) tool), PySpark (the Python API for Apache Spark), and Apache Hive.
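As an illustration of the data-validation scripting the responsibilities above describe, here is a minimal Python sketch. The record layout, field names, and sample feed are hypothetical, not taken from the posting:

```python
import csv
import io

# Hypothetical pre-load validation pass over an ingested CSV feed:
# check required fields are present and that amounts are numeric
# before rows reach the warehouse.

REQUIRED = ["account_id", "report_date", "amount"]


def validate_rows(text):
    """Return (good_rows, errors) for a CSV feed held in a string.

    Line numbers in the error list start at 2 because line 1 is the header.
    """
    good, errors = [], []
    for lineno, row in enumerate(csv.DictReader(io.StringIO(text)), start=2):
        missing = [f for f in REQUIRED if not row.get(f)]
        if missing:
            errors.append((lineno, f"missing fields: {missing}"))
            continue
        try:
            row["amount"] = float(row["amount"])
        except ValueError:
            errors.append((lineno, "amount is not numeric"))
            continue
        good.append(row)
    return good, errors


# Sample feed: one valid row, one with a blank date, one with a bad amount.
feed = (
    "account_id,report_date,amount\n"
    "A1,2024-01-31,100.50\n"
    "A2,,25.00\n"
    "A3,2024-01-31,oops\n"
)
good, errors = validate_rows(feed)
print(len(good), len(errors))  # 1 valid row, 2 rejected
```

In practice a script like this would also write its rejects to an audit log, which is what the "process auditing" bullet refers to.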
Required Technical And Professional Expertise
Bachelor's degree or equivalent in Engineering or a related field and five (5) years of experience as a Managing Consultant, Engineer, or related role. The five (5) years of experience must include utilizing Oracle SQL/PLSQL, Unix shell scripting, Java, data analytics and integration, Informatica PowerCenter (ETL tool), PySpark (the Python API for Apache Spark), and Apache Hive. $189,592 to $216,700 per year. Full time. SN159
Benefits
- $189,592 to $216,700 per year
- Full time