Redshift jobs
We need an engineer to migrate from Teradata to AWS Redshift. Please reach out if you know both Teradata and Redshift. Thanks.
Trying to port a window function from Snowflake to Redshift.
I'm looking for a freelancer to help with a project that includes database migration. The database I'm working with is a Relational database (MySQL) that needs to be migrated to Redshift. Successful applicants should include evidence of relevant past work in their application. In addition to the database migration, the project will also require extensive SQL coding, use of cloud platforms such as AWS and Redshift, and experience with advanced SQL. Candidates must also have experience with Data Warehousing and an Analytics Platform – the specific platform will depend on the nature of the project. As such, I'm seeking an experienced developer, ideally with a background in database migration who can demonstrate expertise in the tools and skills required. D...
My project is to install an array of VFX software on a macOS system. This includes a number of independent license installers for Houdini, RenderMan, Redshift, and Arnold. I need assistance with the setup and installation of these indie licenses on this operating system. I already have the licenses, so I just need the right kind of help to install the software correctly. This is a small, specific project that requires knowledge of the software and the operating system, and the ability to manage software installation in order to complete it.
AWS Data Engineer with a minimum of 6 to 9 years of experience (JD1) · Collaborate with business analysts to understand and gather requirements for existing or new ETL pip... AWS Data Engineer with a minimum of 5 to 7 years of experience (JD 2) · Experience with AWS (Glue, Lambda, AppFlow, DynamoDB, Athena, Step Functions, S3) · Experience with relational SQL and NoSQL databases such as MySQL, Postgres, MongoDB, and Cassandra. · Experience with data pipeline tools like Airflow, etc. · Experience with cloud data services such as EC2, S3, EMR, RDS, Redshift, BigQuery · Experience with stream-processing systems such as Storm, Spark Streaming, Flink, etc. · Experience with object-oriented/object funct...
Looking for Data Engineers with at least 3 years of experience in end-to-end ETL/ELT and data transformation... platforms can also apply. Only candidates who can work in a US timezone (EST/CST) should apply. AWS services such as Glue, Lambda, Athena, S3, SNS, Kinesis, Data Pipeline, PySpark, etc.; Kafka/Kafka Connect, Spark, Flink, or AWS Kinesis; Apache NiFi; Dataflow; Kubernetes; AWS Data Pipeline; Snowflake; GCP tools - GCS, GKE, BigQuery, Cloud SQL, Cloud Connector; Golang; Airflow; TypeScript. Data: DBT, Fivetran, Redshift, PostgreSQL. Infra: GitHub, Bazel, Docker. Azure Data Factory, Azure Databricks, DAX, MDX, Terraform. Visualization tools - Data Studio, Amplitude, Tableau. Languages - Scala, Python, SQL. Azure Synapse, Mapping Data Flow. Interested candidates can WhatsApp +1-81...
...EMR, Hadoop, AWS services, and PySpark · Proficiency with data processing: HDFS, Hive, Spark, Python. · Strong analytic skills for working with structured, semi-structured, and unstructured datasets. · Expertise in AWS cloud-native services. · Good knowledge of any RDBMS/NoSQL database with strong SQL writing skills · Experience with data warehouse tools like Redshift · Experience in deployment and migration of various workloads to cloud services from traditional infrastructure or other clouds. · Strong analytical and problem-solving capability · Excellent verbal and written communication skills · Ability to collaborate effectively across global teams ...
I am looking for a certified developer in the...a blocker to other work they may do on the project at a later time. Additional consulting resources would be necessary to support this group with AWS expertise. Skills needed: AWS setup of a data lake, AWS tools (Redshift, S3, Athena, Glue), AWS security, higher-education implementation experience, and experience in the following higher-education data domains: Finance, Budget, Research, and HR. AWS Architect certification is needed. The AWS Big Data certification is recommended for this position. Top skills & 3+ years of experience: AWS setup of a data lake; AWS tools (Redshift, S3, Athena, Glue); AWS security; AWS Architect certification is needed. Please start your bid with "I am a US citizen or green card holder", othe...
Need to understand a few things about AWS Redshift and, ideally, create a small query/report from data we have in AWS Redshift.
Hi, I have a simple 3D geometry .FBX file that I made in Blender, and I would like a growth simulation done in Houdini and rendered in Redshift. The geometry is very simple and nothing heavy; it's basically a decorated torus. You don't need to render it, I can do that (unless it's best if you do). We can talk about this and how the animation would go! The .fbx file already has all the colours and materials. 240 frames. I can share everything with you if you are interested! Thank you!
Need someone who has good experience with Spark, Redshift, S3, and AWS Glue.
Data migration from RDBMS to AWS S3 and Redshift. 1. Creating a framework that converts scripting languages like PL/SQL, BTEQ, etc. to Python and PySpark, using Databricks as the compute. 2. A framework that converts existing RDBMS scripts to Python or PySpark, ready to use on AWS Databricks compute. Need someone who has done this before or been part of it. Should be able to give some use cases on how they implemented it. The main RDBMS in use is Teradata with BTEQ scripting.
Having expertise in AWS Cloud • Designing and deploying dynamically scalable, available, fault-tolerant, and re... CloudWatch, Lambda, QuickSight, Redshift. • Experience in automating AWS resource deployment using IaC (Terraform) • Python coding skills for serverless Lambda • Experience in deploying an OpenVPN Cloud setup for security • Monitoring infrastructure health and security using SaaS applications (Prowler, CloudSploit) • Designed dashboards in QuickSight using direct query with RDS & Redshift • Selecting appropriate cloud services to design and deploy an application based on given requirements • Implementing cost-control strategies • Understanding of application lifecycle management • Understanding in t...
...creating, updating, and maintaining the ETL jobs with the same technology stack. If you are interested in technological innovation and are still looking for new sources of knowledge, we would like to welcome you on board. Check below what we offer and what we expect. Our requirements: 1. At least 2 to 3 years of relevant experience as a Big Data Engineer; understanding of MongoDB (NoSQL) and the Redshift database. 2. Min 2 years of relevant hands-on application development experience in Scala with the Spark framework; experience building modern and scalable REST-based microservices using Scala with Spark. 3. Expertise in functional programming with Scala; experience implementing RESTful web services in Scala; experience with NoSQL/SQL databases. 4. Should have ...
We are looking for support from a Data Engineer (AWS Glue, Athena, Redshift, Python, and Snowflake). We will pay 23-25k per month.
...to upload some projects for learning purposes to AWS: 1. Create CI/CD pipelines (Jenkins, etc.) 2. Add some security features 3. How to secure servers with multiple staff logins 4. Teach me how to create EC2 instances and other related concepts, with practicals 5. S3 buckets and their policies, CloudFormation, Elastic Beanstalk, CloudFront, Kinesis, SQS, SNS, Amazon DynamoDB, Aurora, Redshift and other database practicals, CloudWatch, CloudTrail 6. Some microservices 7. Docker containers 8. VPC concepts with practicals, so I can build my confidence and learn faster, and a few other services 9. As I am mostly concentrating on Python, I don't have much time to spend on AWS, so with someone's help I can make this process faster. Any idea how much you would charge ...
...stack that includes custom web crawlers hosted in AWS EC2 and S3, publishing applications in Snowflake and Redshift, and processing applications in AWS Redshift, AWS Glue, and Snowflake/Snowpipe. We use Sigma for data visualization because it is very easy to develop in, integrates extremely well with Snowflake, and can handle very large datasets with high performance. The application this role will build and run will need to track operations across this entire stack, including monitoring and alerting on operational parameters as well as data continuity at the field level. This position requires a combination of process-management and development skills. Strong experience with both Redshift and Snowflake is required, as is experience building Python applications...
Expected outcome: build a UI that allows the user to select a Redshift schema (the UI may restrict which schemas can be selected) to be copied to an S3 bucket in another environment. We have two Redshift databases in two different environments (e.g., A1 and A2) that have no direct access to each other, so we have to copy the schemas from the A1 Redshift to the A1 S3 bucket, from the A1 S3 bucket to the A2 S3 bucket, and then from the A2 S3 bucket to the A2 Redshift database. A button click should initiate the copy operation. Every operation invocation must create an audit record containing who performed the operation, when it happened, complete details of the copy source, and approval comments. Unload and copy operation progress should be vie...
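A flow like this is typically implemented with Redshift's UNLOAD and COPY commands via S3. A minimal sketch, assuming hypothetical schema, bucket, and IAM role names (the real backend would also need the cross-bucket S3 copy and a driver connection, which are omitted here):

```python
from datetime import datetime, timezone

def build_unload(schema: str, table: str, s3_path: str, iam_role: str) -> str:
    # UNLOAD a table from the source (A1) Redshift cluster to its S3 bucket
    return (
        f"UNLOAD ('SELECT * FROM {schema}.{table}') "
        f"TO '{s3_path}/{schema}/{table}/' "
        f"IAM_ROLE '{iam_role}' FORMAT PARQUET"
    )

def build_copy(schema: str, table: str, s3_path: str, iam_role: str) -> str:
    # COPY the staged files into the target (A2) Redshift cluster
    return (
        f"COPY {schema}.{table} "
        f"FROM '{s3_path}/{schema}/{table}/' "
        f"IAM_ROLE '{iam_role}' FORMAT PARQUET"
    )

def audit_record(user: str, source: str, target: str, comment: str) -> dict:
    # Audit entry the posting requires for every copy invocation
    return {
        "user": user,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,
        "target": target,
        "approval_comment": comment,
    }
```

The UI's "copy" button would call these builders, run the statements against each cluster, and persist the audit record before starting the transfer.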
Data modeling for a lending business. Loading data from multiple systems into AWS S3 buckets. Finally, the data has to be loaded into Amazon Redshift.
Hi, we are a team of 13 developers, and we are expanding. We are looking for a Machine Learning Engineer with 3+ years of experience. Your main role and responsibility is to build an algorithm from scratch or modify an existing algorithm for our SaaS product. This backend work is not common backend API development. It has a complex flow and process to make i...Python and common machine learning frameworks - Has a good mathematical and theoretical understanding of machine learning fundamentals - Has significant experience building and deploying machine learning applications at scale - Has a solid understanding of computer science fundamentals like algorithms. You are good at: - Python - Machine Learning - Big data and ETL pipelines (AWS Redshift) - AWS for Machine Learning ...
Can you create an Azure Data Factory pipeline which reads a Parquet file from Blob Storage and writes to Redshift, Synapse, or Snowflake? Use Azure Databricks for basic transformation. Blob Storage -> Azure Databricks -> Redshift
Create a DaaS using structured data residing on Redshift. The DaaS is a collection of template-based reports with filters, offered in different combinations to several subscription levels.
Need a technical author who has experience in writing on topics like AWS Azure GCP DigitalOcean Heroku Alibaba Linux Unix Windows Server (Active Directory) MySQL PostgreSQL SQL Server Oracle MongoDB Apache Cassandra Couchbase Neo4J DynamoDB Amazon Redshift Azure Synapse Google BigQuery Snowflake SQL Data Modelling ETL tools (Informatica, SSIS, Talend, Azure Data Factory, etc.) Data Pipelines Hadoop framework services (e.g. HDFS, Sqoop, Pig, Hive, Impala, Hbase, Flume, Zookeeper, etc.) Spark (EMR, Databricks etc.) Tableau PowerBI Artificial Intelligence Machine Learning Natural Language Processing Python C++ C# Java Ruby Golang Node.js JavaScript .NET Swift Android Shell scripting Powershell HTML5 AngularJS ReactJS VueJS Django Flask Git CI/CD (Jenkins, Bamboo, TeamCity, Octopus Depl...
--ROLE-- The AWS DevOps Engineer will be working closely with the founders of a startup to design and create an AWS cloud infrastructur...can come into our London office early on in the project to meet the team, that would be a bonus. However, we are also open to fully remote working for the right candidate. --RESPONSIBILITIES-- • Designing and implementing cloud infrastructure • Implementing the CI/CD pipeline, preferably with GitHub • Security and performance • Networking --EXPERIENCE REQUIRED-- • AWS resources (RDS, DynamoDB, Redshift, Lambda, API Gateway, EventBridge, EC2) • Big data infrastructure • Infrastructure as code with Terraform --DESIRABLE EXPERIENCE-- • Data lake and data warehouse --THE COMPANY-- Early stage startup driving...
I created this project (a project that builds an **ELT pipeline** which extracts data from **S3**, stages it in **Redshift**, and transforms it into a set of **dimensional tables** for Sparkify's analytics team to continue finding insights into what songs their users are listening to). It is very simple, and it is all ready and done, but I have one issue: it is unable to run. Please address the issue noted below. The script results in the error shown in the attached screenshot. The project is attached.
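The staging-to-dimensional step this project describes usually boils down to an `INSERT INTO ... SELECT DISTINCT` from the staging table. A minimal sketch of that pattern, using in-memory SQLite so it is self-contained (table and column names here are hypothetical, and real Redshift SQL would add distribution/sort keys and COPY from S3):

```python
import sqlite3

# Illustrative staging -> dimension load, the same shape a Redshift ELT
# transform step would take after the COPY from S3 has filled staging.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging_events (user_id INT, song TEXT, artist TEXT)")
cur.executemany(
    "INSERT INTO staging_events VALUES (?, ?, ?)",
    [(1, "Song A", "Artist X"), (2, "Song B", "Artist Y"), (1, "Song A", "Artist X")],
)
# Dimension table populated with SELECT DISTINCT to deduplicate staged rows
cur.execute("CREATE TABLE dim_song (song TEXT, artist TEXT)")
cur.execute("INSERT INTO dim_song SELECT DISTINCT song, artist FROM staging_events")
rows = cur.execute("SELECT COUNT(*) FROM dim_song").fetchone()[0]
print(rows)  # 2 distinct songs survive deduplication
```

If the script fails at exactly this stage in Redshift, a mismatch between the staging columns and the dimension table's DDL is a common cause.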
Create a data pipeline using Airflow (S3 -> Databricks -> Redshift). Composer.
Can you create a sample Data Pipeline using Apache Airflow. Source: S3 Target: Redshift
Help needed with updating an AWS CloudFormation template (YAML) related to Redshift, secrets, Glue, Lambda, etc.
Hi, I am a data scientist working at a travel company. I need help with my day-to-day tasks, so this won't be a one-time project but daily support for my job. The tools I mainly work with: 1. Amazon SageMaker 2. Amazon S3 and Redshift 3. AWS Lambda 4. Google Colab notebooks. And a few others, which we can discuss later.
Hi, I need a Python script that can pick up data from a SharePoint list and push it to a Redshift table.
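A sketch of the shaping half of such a script, assuming items have already been fetched from SharePoint's REST API (its list endpoints return a `value` array of item dicts); the SharePoint authentication and the Redshift connection (e.g. via `redshift_connector` or `psycopg2`) are left out since they need credentials:

```python
def rows_from_sharepoint(items: list, columns: list) -> list:
    # Flatten SharePoint REST items (the 'value' array) into row tuples,
    # in the column order expected by the target table
    return [tuple(item.get(col) for col in columns) for item in items]

def build_insert(table: str, columns: list) -> str:
    # Parameterized INSERT suitable for a DB-API executemany() call
    placeholders = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"

# Hypothetical usage with a Redshift driver (connection details omitted):
# cur.executemany(build_insert("sp_list", cols), rows_from_sharepoint(items, cols))
```

For larger lists, a COPY from S3 would be faster than row-wise INSERTs, but the shaping logic stays the same.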
I need to move the sample AdventureWorks database from my SQL Server to AWS Redshift or RDS using Airflow or Kafka.
Amazon Seller API integration with third-party tools. Key skills: SP-API; AWS - EC2, S3, IAM; Amazon Redshift; AWS Lambda. We will discuss project details with a suitable candidate.
Skill sets: Python, Spark, Kafka/Kinesis, EMR/Glue, S3, Redshift, Airflow, Jenkins. Activities developers need to perform: load data into S3; perform ETL functions in Glue; data tiering on S3; filter, join, and aggregation; move data to Redshift; EMR; Apache Airflow as the overall orchestrator. They will use a pipeline on Glue (Python skills required) and move code from Dev to Production.
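The filter/join/aggregate step named above can be sketched in plain Python to show the shape of the transform; in a real Glue job the same three operations would run on PySpark DataFrames (`filter`, `join`, `groupBy().agg()`), and all names and values here are hypothetical:

```python
# Hypothetical inputs standing in for datasets read from S3
orders = [
    {"order_id": 1, "cust_id": "c1", "amount": 10.0},
    {"order_id": 2, "cust_id": "c2", "amount": 5.0},
    {"order_id": 3, "cust_id": "c1", "amount": 7.5},
]
customers = {"c1": "Ann", "c2": "Bob"}

# Filter: keep orders above a threshold
big = [o for o in orders if o["amount"] > 6.0]
# Join: attach the customer name to each surviving order
joined = [{**o, "name": customers[o["cust_id"]]} for o in big]
# Aggregate: total amount per customer
totals = {}
for row in joined:
    totals[row["name"]] = totals.get(row["name"], 0.0) + row["amount"]
print(totals)  # {'Ann': 17.5}
```

The aggregated result would then be written back to S3 and loaded into Redshift with COPY, per the activity list above.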
Looking for Data Engineer Full time Experience- 5-8 Years Primary Skills- S3, AWS Redshift, Pyspark, AWS Glue, Python, SQL Working Days - Mon to Fri Shift- Indian Shift
AMI or AEIMS members ONLY Hi, A small team of 3D animators is seeking an additional C4D animator, preferably with XP and Redshift knowledge. Seeking a medical illustrator. Please only apply if you are a professional medical illustrator AMI or AEIMS and/or have a specific degree in medical illustration. Work is primarily dental and orthodontic. Established workflow and library. Casual and friendly work from home environment. Detailed 3D storyboard and Skype based assistance provided. Gig will likely last one to two years, could be longer. Prefer someone with a PC workstation with a 3000 series RTX GPU or several 2000 series GPUs. High speed internet recommended.
...promote them from Dev to QA to Prod. Build and host API services. Build tooling to allow users to deploy microservices to production with a simple set of commands (Terraform). Profile: Bachelor's degree in Computer Science or a related technical field. Software engineer who is an expert in cloud infrastructure and architecture (AWS). Experience with Amazon Web Services (AWS): Lambda, Amazon Redshift, Glue, EKS, Athena, API Gateway. Programming experience with Unix/shell scripting. Hands-on experience with SQL and Python. Hands-on experience with orchestration tools such as Airflow and DBT. Strong problem-solving skills, research, and analytical thinking. Possesses solid troubleshooting skills. Ability to work in a fast-paced and agile development environment. Preferable exper...
You will find instructions in the attached PDF. For this data task, you are expected to write a piece of SQL code. If possible, stick to using Redshift SQL. Feel free to add comments to your code to explain your process. The deadline is November 5th, 10:00 AM (GMT+1).
This opening is for a stealth startup and is unpaid. It's only for experience and to help develop a trillion-dollar software idea! Currently seeking a part-time Full Stack Developer for this project. Contact me only if you know the following: we need to develop with a fast JavaScript framework (Bun), AWS S3, EC2, Redshift, Aurora, DocumentDB, DynamoDB, and the like. Please message me for project details!
Skillset needed: defect resolution and production support of big data ETL development using AWS-native services; creating data pipeline architecture by designing and implementing data ingestion solutions; integrating data sets using AWS services such as Glue and Lambda functions; designing and optimizing data models on AWS using data stores such as Redshift, RDS, S3, and Athena; authoring ETL processes using Python and PySpark; ETL process monitoring using CloudWatch events. You will be working in collaboration with other teams. We are looking for an engineer to resolve the issues described below in our AWS environment. Enable paging through data returned from each API using the offset field. Delta load enablement for dimension tables (16), fact tables (6), and derived tables (4). Go back in time a...
...years of solid experience in AWS data architecture and building data warehouse, data lake, ELT/ETL, and business intelligence projects. - Expertise in AWS data modelling; able to build conceptual, logical, and physical data models quickly. - Strong knowledge of and experience with AWS data products like S3, Athena, Redshift, Glue, EMR, Lambda, etc. - Strong experience designing and developing data processing and storage in AWS Redshift. - Efficiently managing data operations on AWS Redshift and its clusters. - Handling AWS S3 with efficient storage, partitioning, and retrieval techniques. - Expertise in SQL and NoSQL concepts, with vast experience with RDBMS and NoSQL databases. - Experience with the Python programming language. - Technical expertise in ETL/ELT pipe...
I'm looking for a project-based animator who will render 8 animations with a total of 1,851 frames combined. These animations were created with C4D and use Redshift. The projects are 30 fps. The deadline and resolution are somewhat flexible. Please discuss this with me beforehand, as the deadline is flexible by 1 or 2 days and I am able to compromise on the resolution so that it is finished on time. If this goes well, we can work together again soon, as I will have another set of animations for rendering right away. Please let me know if this is something you are able to do and what the cost would be.
Overview: We are looking for a full-time developer with experience building high-load/high-availability SaaS platforms from the ground up, and experience building data analytics and marketing software. Technical requirements - Full Stack Developer ● Lever...developments in web applications and programming languages. Desired competencies: • Expertise in functional programming using JavaScript (ES5, ES6) • Expertise in UI frameworks - Angular / React/Redux, RxJS • Preferred: experience with a new generation of web programming - microservices, REST/JSON, component UI models • Preferred: experience with AWS Cloud • Preferred: experience with AWS Redshift or Postgres • Angular / React/Redux, RxJS, HTML, CSS, JavaScript (ES5, ES6), data visuali...
Requirements - Experience creating an S3 data lake using Glue, Crawlers, and Lake Formation - Experience with Redshift, DynamoDB, RDS, Aurora, and OpenSearch - Experience querying SQL and NoSQL databases - Experience creating relational data table schemas and partitions - Experience building ETLs using Lambdas and Step Functions - Experience with big data processing via EMR and Spark - Experience with stream processing via Kinesis - Experience with CI/CD using CodePipeline, CodeCommit, CodeBuild - Experience with deployment frameworks such as AWS CloudFormation, CDK, Terraform, and the Serverless Framework. Range of experience - 3-5 years of related work experience building MLOps pipelines in AWS
Need a Talend Big Data expert for Talend, AWS Redshift, and a Snowflake integration.
Hello! We are looking for a senior 3D animator who can dedicate full time to our project. You must have rich experience with motion graphics, Cinema 4D, Redshift, etc. Please don't apply if you are not a C4D expert. Thanks!
ETL/data processing expert needed for a MongoDB-to-Redshift migration.