DevOps Engineer at Cyberjin
Hybrid Role
We are looking for a DevOps Engineer with prior experience with Big Data solutions and cloud technology and a strong working knowledge of Linux. The ideal candidate is passionate about infrastructure as code and leverages modern tools to define, build, and manage virtual infrastructure in the cloud. Work is performed in a hybrid environment with a great team.
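As a rough illustration of the infrastructure-as-code idea (the posting does not name the team's actual tooling, so AWS, boto3, and every resource value below are assumptions), the sketch shows a cloud resource being created and tagged from code rather than from a console:

```python
# Illustrative sketch only: launching and tagging an EC2 instance from code.
# Assumes AWS credentials are already configured and boto3 is installed;
# the region, AMI ID, instance type, and tag values are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[
        {
            "ResourceType": "instance",
            "Tags": [{"Key": "Project", "Value": "big-data-prototype"}],
        }
    ],
)

print("Launched", response["Instances"][0]["InstanceId"])
```

In practice a declarative tool such as Terraform or CloudFormation would typically describe resources like this, but the principle is the same: the infrastructure definition lives in version control alongside the application code.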
Essential Job Responsibilities
The ideal candidate believes in exploring alternatives and quickly prototyping to validate hypothetical architectures or solutions.
Will contribute significantly to the development of custom software components and the integration of open-source code to address complex time-series analysis problems using cutting-edge Big Data and cloud technology. Will design, implement, and maintain core architecture and capabilities for software from prototype to operational applications.
Must understand software engineering fundamentals, object-oriented programming, relational and time-series databases, and scripting, and have a basic development operations (DevOps) skill set.
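As a small, purely illustrative example of the kind of time-series handling described above (the data, column names, and interval are invented, not taken from the posting), the sketch below downsamples raw time-stamped measurements into fixed one-minute buckets with pandas:

```python
# Illustrative sketch: rolling raw time-stamped measurements up into
# one-minute windows. The data and column names are invented for the example.
import pandas as pd

raw = pd.DataFrame(
    {
        "timestamp": pd.to_datetime(
            ["2024-01-01 00:00:05", "2024-01-01 00:00:45", "2024-01-01 00:01:30"]
        ),
        "value": [10.0, 12.5, 9.0],
    }
).set_index("timestamp")

# Downsample to one-minute buckets, taking the mean of each bucket.
per_minute = raw["value"].resample("1min").mean()
print(per_minute)
```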
Minimum Qualifications
- Security Clearance: a current Secret clearance is required; therefore, all candidates must be U.S. citizens.
- 5+ years of experience in DevOps engineering or software development (Java preferred) and a Bachelor's degree in a related field; or 3 years of relevant experience with a Master's degree in a related field; or a high school diploma or equivalent and 9 years of relevant experience.
- Strong working knowledge of Linux systems, hosts, storage, networks, security, and applications, and proficiency in scripting (Shell/Bash, JavaScript, Python).
- Excellent oral and written communication skills.
- Must have a Security+ certification.
- Must be able to work in a hybrid environment.
Preferred Requirements
- Experience with big data technologies such as Hadoop and NoSQL databases; experience with AWS is highly desired.
- Prior experience or familiarity with the Unified Platform (UP) Big Data Platform (formerly owned by DISA) is a plus.
- Data parsing and transformation techniques, including JSON, XML, and CSV formats (a brief illustrative sketch follows this list).
- Understanding of Agile software development methodologies and standard software development tool suites (e.g., JIRA, Confluence, GitHub Enterprise).
- Willingness to take on-call/pager duty is a big plus. A rotating shift is possible for this role in the future, as the current team is full.
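As a minimal illustration of the data parsing/transformation item above (the record layout, field names, and sample values are invented for the example), the sketch below reads the same logical record from JSON, XML, and CSV sources and normalizes each into one common dictionary shape:

```python
# Illustrative sketch: normalizing one record from JSON, XML, and CSV input
# into a common dictionary shape. Field names and sample data are invented.
import csv
import io
import json
import xml.etree.ElementTree as ET

json_src = '{"id": "42", "value": "3.14"}'
xml_src = "<record><id>42</id><value>3.14</value></record>"
csv_src = "id,value\n42,3.14\n"

def from_json(text):
    data = json.loads(text)
    return {"id": data["id"], "value": float(data["value"])}

def from_xml(text):
    root = ET.fromstring(text)
    return {"id": root.findtext("id"), "value": float(root.findtext("value"))}

def from_csv(text):
    row = next(csv.DictReader(io.StringIO(text)))
    return {"id": row["id"], "value": float(row["value"])}

# All three sources reduce to the same normalized record.
assert from_json(json_src) == from_xml(xml_src) == from_csv(csv_src)
```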