A Tenth Revolution Group company

Your current job search

24 search results

For permanent and freelance positions in the USA

    Sr Data Engineer - Houston

    USA, Texas, Houston

    • $170,000 to $180,000 USD
    • Engineer role
    • Skills: ETL, Python, Snowflake, SQL
    • Seniority: Senior

    Job description

    Our TOP CLIENT has retained us for a Sr Data Engineer role. They are well known and highly regarded for their stability and commitment to a collaborative teamwork environment. The client is building a best-in-class data science platform, as being at the forefront of data management and analytics is core to their investment platform. Overall, this position will play an integral role as the team implements new data management platforms, creates new data ingestion pipelines, and sources new data sets. The role will assist with all aspects of data, from data architecture design to ongoing data management, and will have significant exposure to their Risk and commercial investing teams globally.



    REQUIRED:

    Must be onsite 3 days a week in the Houston, TX area

    Extensive work experience with ETL/ELT frameworks to write pipelines.

    Advanced skills in writing highly optimized SQL code.

    Experience with Snowflake.
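
    As one illustration of the ETL/ELT, SQL, and Snowflake requirements above, here is a minimal sketch of an ELT step written with the snowflake-connector-python library; the connection parameters, stage, and table names are hypothetical placeholders rather than anything from this listing.

    # Minimal ELT sketch: copy staged files into a raw Snowflake table,
    # then transform them into a curated table with set-based SQL.
    # Account, credentials, and object names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",          # hypothetical account identifier
        user="etl_user",
        password="...",                # use a secrets manager in practice
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    with conn.cursor() as cur:
        # Load: ingest staged CSV files into a raw landing table.
        cur.execute("""
            COPY INTO RAW.TRADES_RAW
            FROM @RAW.TRADES_STAGE
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)
        # Transform: upsert cleaned rows into the curated table, keeping
        # the SQL set-based so Snowflake can optimize it.
        cur.execute("""
            MERGE INTO CURATED.TRADES t
            USING (
                SELECT trade_id,
                       TRY_TO_DATE(trade_date) AS trade_date,
                       UPPER(TRIM(symbol)) AS symbol,
                       quantity::NUMBER AS quantity
                FROM RAW.TRADES_RAW
            ) s
            ON t.trade_id = s.trade_id
            WHEN MATCHED THEN UPDATE SET t.quantity = s.quantity
            WHEN NOT MATCHED THEN INSERT (trade_id, trade_date, symbol, quantity)
                VALUES (s.trade_id, s.trade_date, s.symbol, s.quantity)
        """)
    conn.close()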



    TOP RESPONSIBILITIES:

    Execute data architecture and data management projects for both new and existing data sources.

    Help transition existing data sets, databases, and code to a new technology stack.

    Over time, lead analysis of data sets using a variety of techniques including machine learning.

    Manage end to end data ingestion process and publishing to investing teams.

    Own the process of mapping, standardizing, and normalizing data (see the sketch after this list).

    Conduct ad hoc research on project topics such as vendor offerings and trends, usage best practices, big data, and artificial intelligence.

    Assess data loads for tactical errors and build out appropriate workflows, as well as create data quality analysis that identifies larger issues in data.

    Actively manage vendors and capture changes in data input proactively.

    Properly prioritize and resolve data issues based on business usage.

    Assist with managing strategic initiatives around big data projects for the commercial (trading) business.

    Partner with commercial teams to gain an understanding of current data flows, data architecture, and the investment process, as well as to gather functional requirements.

    Assess gaps in current datasets and remediate.
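
    Several of the responsibilities above (mapping, standardizing, and normalizing data; building data quality analysis) typically come down to work like the following Pandas sketch; the column names, mappings, and input file are hypothetical.

    # Sketch: standardize a vendor extract and summarize quality issues
    # before publishing to investing teams. All names are hypothetical.
    import pandas as pd

    COLUMN_MAP = {"Cpty": "counterparty", "Vol(MWh)": "volume_mwh", "Dt": "trade_date"}

    def standardize(raw: pd.DataFrame) -> pd.DataFrame:
        df = raw.rename(columns=COLUMN_MAP)
        df["counterparty"] = df["counterparty"].str.strip().str.upper()
        df["trade_date"] = pd.to_datetime(df["trade_date"], errors="coerce")
        df["volume_mwh"] = pd.to_numeric(df["volume_mwh"], errors="coerce")
        return df

    def quality_report(df: pd.DataFrame) -> pd.DataFrame:
        # Per-column null counts/rates; large values flag systemic issues.
        return pd.DataFrame({
            "null_count": df.isna().sum(),
            "null_pct": df.isna().mean().round(3),
        })

    raw = pd.read_csv("vendor_extract.csv")   # hypothetical vendor file
    clean = standardize(raw)
    print(quality_report(clean))
    print("duplicate rows:", clean.duplicated().sum())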



    PREFERRED SKILLS:

    Experience in SQL programming, data architecture, and dimensional modeling.

    Interest and passion for data architecture, analytics, management, and programming.

    Experience in energy commodities or financial services required.

    Experience in mapping, standardizing, and normalizing data.

    Experience with data integration platforms is preferred, and SnapLogic experience is highly preferred.

    Extensive work experience with ETL/ELT frameworks to write pipelines to load millions of records.

    Advanced skills in writing highly optimized SQL code.

    Experience with relational databases Snowflake or Oracle is preferred.

    Python work experience with Pandas, NumPy, and Scikit.

    Ability to communicate and interact with a wide range of data users - from very technical to non-technical.

    Team player who is execution-focused, with the ability to handle a rapidly changing set of projects and priorities, while maintaining a strong professional presence.

    Strong analytical skills with demonstrated attention to detail.

    Familiarity with business intelligence tools such as Power BI and Tableau.

    Interest or experience in machine learning/Artificial Intelligence is a plus.



    BENEFITS:

    Competitive comprehensive medical, dental, retirement and life insurance benefits

    Employee assistance & wellness programs

    Parental and family leave policies

    CCI in the Community: Each office has a Charity Committee, and as part of this program employees are allocated 2 days annually to volunteer at selected charities.

    Charitable contribution match program

    Tuition assistance & reimbursement

    Quarterly Innovation & Collaboration Awards

    Employee discount program, including access to fitness facilities

    Competitive paid time off

    Continued learning opportunities



    Reach out to me directly if interested:

    E: s.murray@jeffersonfrank.com

    Data Quality Analyst

    USA, Illinois, Champaign

    • $80,000 to $90,000 USD
    • Analyst role
    • Skills: AWS, PostgreSQL, SQL, Python, pgAdmin, Toad, DataGrip
    • Seniority: Mid-level

    Job description

    Our TOP CLIENT has retained us for a Data Quality Analyst role. They are well known and highly regarded for their stability and commitment to a collaborative teamwork environment. The Data Quality Analyst will be responsible for acquiring data from primary or secondary sources. This position involves obtaining legacy data from the former billing system and mapping it to fit the billing system data model.



    Location: Remote

    Salary: $80K to $90K



    Must Have skills:

    Strong SQL skills

    Python scripting

    Database tools like pgAdmin, Toad, or DataGrip

    Relational Databases experience (any)



    Data collection: Acquiring data from primary or secondary sources. This position involves obtaining legacy data from the former billing system and mapping it to fit our billing system data model. We sometimes do this ETL work internally and/or work with ETL partners. Analytical skills are an important tool, as is being comfortable working with data files.
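
    Since the skills above call out SQL, Python, and PostgreSQL tooling, a minimal reconciliation check between a legacy extract and the new billing schema might look like the sketch below; the connection string, schema, table, and column names are hypothetical.

    # Sketch: reconcile row counts and account coverage between a legacy
    # billing table and the new billing schema in PostgreSQL.
    # DSN, schema, table, and column names are placeholders.
    import psycopg2

    conn = psycopg2.connect("dbname=billing user=analyst host=localhost")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT COUNT(*) FROM legacy_invoices;")
            legacy_count = cur.fetchone()[0]

            cur.execute("SELECT COUNT(*) FROM billing.invoices;")
            migrated_count = cur.fetchone()[0]

            # Accounts present in the legacy extract but missing after mapping.
            cur.execute("""
                SELECT l.account_id
                FROM legacy_invoices l
                LEFT JOIN billing.invoices n ON n.account_id = l.account_id
                WHERE n.account_id IS NULL
                LIMIT 50;
            """)
            missing = [row[0] for row in cur.fetchall()]

        print(f"legacy rows: {legacy_count}, migrated rows: {migrated_count}")
        print(f"sample accounts missing after mapping: {missing}")
    finally:
        conn.close()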



    * Data interpretation: Using statistical techniques to analyze data and draw logical conclusions.
    * Data visualization: Using tools to identify patterns and present findings to stakeholders.
    * Report and presentation development: Creating reports and presentations to explain data to stakeholders, IT representatives, and other data analysts.
    * Data quality control: Validating and linking data, and providing quality assurance for imported data.
    * Data protection: Understanding data protection issues and processing confidential data according to guidelines.
    * Confirms project requirements by studying client requirements and collaborating with others on development and project teams.
    * Works closely with project managers to understand and maintain focus on their analytics needs, including critical metrics and KPIs, and deliver actionable insights to relevant decision-makers.
    * Proactively analyze data to answer key questions for stakeholders, with an eye on what drives business performance, and investigate and communicate which areas need improvement in efficiency and productivity.
    * Create and maintain rich, interactive visualizations through data interpretation and analysis, with reporting components from multiple data sources.
    * Gather, interpret, and present data to help others understand and solve problems.
    * Turn project requirements into custom-formatted reports, and analyze business procedures to recommend specific types of data that can be used to improve them.
    * Develop, implement, and maintain leading-edge analytics systems, taking complicated problems and building simple frameworks.

    DevOps Engineer - AI

    USA, Pennsylvania, Bala Cynwyd

    • $130,000 to $150,000 USD
    • DevOps role
    • Skills: AI (artificial intelligence), DevOps, Machine learning, Python
    • Seniority: Mid-level

    Job description

    DevOps Engineer (AI Focused)

    Location: Hybrid in Philadelphia

    Compensation: $130,000 - $150,000

    * Relocation support available

    Overview:
    This role offers an exciting opportunity to join a collaborative, learning-focused team. The organization is forming a brand-new team dedicated to a cutting-edge chatbot project aimed at housing proprietary customer information. This is an impactful role, as you will help shape the team and make key decisions.

    Key Responsibilities:

    * Work on the development of a proprietary chatbot project.
    * Play a pivotal role in forming a new team and contributing to technical decision-making.
    * Collaborate with cross-functional teams in a fast-paced, innovative environment.

    Qualifications:

    * Experience: Around 5 years in relevant fields.
    * DevOps Expertise: Strong understanding required.
    * Programming: Proficiency in Python is mandatory.
    * AI Expertise: At least 2 years of hands-on experience with AI technologies.
    * Cloud Platforms: Experience with Azure, AWS, or GCP.
    * Containerization: Familiarity with Docker and Kubernetes (nice to have).
    * Tech Stacks: Exposure to ELK stack and RAG stack is a plus.
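
    As a rough illustration of the retrieval step in a RAG-style chatbot mentioned above, the sketch below embeds a question, ranks documents by cosine similarity, and assembles a prompt; the embed() function and the in-memory document list are toy stand-ins for a real embedding model and vector store.

    # Toy retrieval-augmented generation (RAG) retrieval step.
    # embed() is a placeholder for a real embedding model; the documents
    # are illustrative, not actual customer data.
    import math

    def embed(text: str) -> list[float]:
        # Placeholder embedding: character-frequency vector over a-z.
        vec = [0.0] * 26
        for ch in text.lower():
            if "a" <= ch <= "z":
                vec[ord(ch) - ord("a")] += 1.0
        return vec

    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    documents = [
        "Customers can reset their password from the account settings page.",
        "Billing statements are issued on the first business day of each month.",
    ]
    doc_vectors = [(doc, embed(doc)) for doc in documents]

    def retrieve(question: str, k: int = 1) -> list[str]:
        q = embed(question)
        ranked = sorted(doc_vectors, key=lambda dv: cosine(q, dv[1]), reverse=True)
        return [doc for doc, _ in ranked[:k]]

    question = "When do billing statements come out?"
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    print(prompt)   # this prompt would then be sent to the chat model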

    Culture:

    * Collaborative and innovative environment.
    * Commitment to continuous learning and professional growth.
    * Fun and dynamic office atmosphere.

    Note: This is an opportunity to work on a groundbreaking project while contributing to a growing team within an established organization.

    AWS Cloud Infrastructure Engineer

    USA, New Jersey, Whippany

    • $130,000 to $135,000 USD
    • Engineer role
    • Skills: Amazon Cloudformation, Amazon EC2, Amazon ECS, Amazon Lambda, Amazon Route53, Amazon S3, AWS VPC
    • Seniority: Mid-level

    Job description

    Job Summary: We are seeking a skilled AWS Cloud Infrastructure Engineer with expertise in serverless architecture to join our team. In this role, you will play a crucial part in managing our cloud-based infrastructure, which relies extensively on AWS services such as Lambda and API Gateway. You should possess strong troubleshooting skills and a profound understanding of serverless technology. Familiarity with Infrastructure as a Service (IaaS) tools, including S3, Route 53, and CloudWatch, is essential.
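
    Since the stack described above leans on Lambda behind API Gateway, a minimal Python handler for a proxy integration is sketched below; the route, parameters, and response body are illustrative only.

    # Minimal AWS Lambda handler for an API Gateway proxy integration.
    # The business logic and payload are illustrative placeholders.
    import json

    def lambda_handler(event, context):
        # API Gateway proxy integrations deliver the HTTP request in `event`.
        name = (event.get("queryStringParameters") or {}).get("name", "world")
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": f"hello, {name}"}),
        }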

    Key Responsibilities:

    * Provision Infrastructure as Code (IaC) using scripts to implement and maintain a highly scalable and redundant global AWS environment (a brief sketch follows this list).
    * Manage the transition of on-premises infrastructure to cloud-based solutions, ensuring optimal performance of ConnectiveRx applications through ongoing operational cloud management.
    * Conduct routine maintenance, upgrades, and optimization of cloud infrastructure and services.
    * Provide day-to-day support and management for infrastructure, which includes overseeing third-party hosted data centers and cloud-based systems.
    * Assist in creating and executing the migration process for legacy systems to our cloud environments, providing regular management updates on progress.
    * Take ownership of AWS environment management, ensuring best practices are followed in building and maintaining applications within AWS.
    * Design and implement resilient web infrastructure for patient assistance programs, ensuring website and database availability and performance through the use of web application firewalls, load balancers, Route 53, EC2, web servers, and VPC configurations.
    * Articulate and describe strategies for setting up and optimizing cloud environments to ensure resilience and efficiency.
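
    The IaC provisioning responsibility above can be scripted in many ways; one minimal sketch uses boto3 to deploy a small CloudFormation template, with the stack name, resource, and region as hypothetical placeholders.

    # Sketch: provision infrastructure as code by deploying a small
    # CloudFormation template with boto3. All names are placeholders.
    import json
    import boto3

    TEMPLATE = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppDataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"VersioningConfiguration": {"Status": "Enabled"}},
            }
        },
    }

    cfn = boto3.client("cloudformation", region_name="us-east-1")
    cfn.create_stack(StackName="example-app-data", TemplateBody=json.dumps(TEMPLATE))

    # Block until creation finishes before wiring the stack into other systems.
    cfn.get_waiter("stack_create_complete").wait(StackName="example-app-data")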

    Preferred Qualifications:

    * Proven experience in a large enterprise environment with a focus on AWS cloud management.
    * In-depth knowledge of various AWS services, including but not limited to VPC, Elastic Load Balancers, Auto Scaling Groups, EC2, ECS, DNS, as well as AWS storage and notification services.
    * Strong understanding of security best practices related to cloud infrastructure.
    * Excellent problem-solving abilities and a proactive approach to infrastructure management.