Airflow vs AWS Lambda: What are the differences?
Developers describe Airflow as "A platform to programmatically author, schedule and monitor data pipelines, by Airbnb". Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed. On the other hand, AWS Lambda is detailed as "Automatically run code in response to modifications to objects in Amazon S3 buckets, messages in Kinesis streams, or updates in DynamoDB". AWS Lambda is a compute service that runs your code in response to events and automatically manages the underlying compute resources for you. You can use AWS Lambda to extend other AWS services with custom logic, or create your own back-end services that operate at AWS scale, performance, and security.
Airflow belongs to the "Workflow Manager" category of the tech stack, while AWS Lambda can be primarily classified under "Serverless / Task Processing".
Some of the features offered by Airflow are:
- Dynamic: Airflow pipelines are configuration as code (Python), allowing for dynamic pipeline generation. This lets you write code that instantiates pipelines dynamically (see the sketch after this list).
- Extensible: Easily define your own operators, executors and extend the library so that it fits the level of abstraction that suits your environment.
- Elegant: Airflow pipelines are lean and explicit. Parameterizing your scripts is built into the core of Airflow using the powerful Jinja templating engine.
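To make "configuration as code" concrete, here is a minimal sketch of an Airflow DAG; the DAG id, schedule, and shell commands are illustrative placeholders, not part of the original description:

```python
# A minimal Airflow 2.x DAG sketch; dag_id, schedule, and commands are
# hypothetical. The {{ ds }} Jinja template is rendered by Airflow at
# run time with the logical date of the run.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract",
                           bash_command="echo extracting {{ ds }}")
    load = BashOperator(task_id="load",
                        bash_command="echo loading")
    extract >> load  # the >> operator declares the dependency edge in the DAG
```

Because the file is plain Python, you can generate tasks in a loop or from external configuration, which is what the "Dynamic" bullet above refers to.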
On the other hand, AWS Lambda provides the following key features (see the handler sketch after the list):
- Extend other AWS services with custom logic
- Build custom back-end services
- Completely Automated Administration
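To show what "run code in response to events" looks like in practice, here is a minimal Lambda handler sketch in Python, assuming the standard S3 event notification shape; the function name and printed output are hypothetical:

```python
# A minimal Lambda handler sketch reacting to S3 object events.
# The record layout follows AWS's documented S3 notification format.
import json

def handler(event, context):
    # Each S3 notification delivers one or more records.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"object changed: s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps("ok")}
```

Lambda provisions and scales the compute for this function automatically, which is what "Completely Automated Administration" means in the feature list above.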
Airflow is an open source tool with 12.9K GitHub stars and 4.71K GitHub forks. Here's a link to Airflow's open source repository on GitHub.
9GAG, Asana, and CircleCI are some of the popular companies that use AWS Lambda, whereas Airflow is used by Airbnb, Slack, and 9GAG. AWS Lambda has broader approval, being mentioned in 1022 company stacks & 612 developer stacks, compared to Airflow, which is listed in 72 company stacks and 33 developer stacks.
Need advice on what platform, systems and tools to use.
Evaluating whether to start a new digital business for which we will need to build a website that handles all traffic. Website only right now. May add smartphone apps later. No desktop app will ever be added. Website to serve various countries and languages. B2B and B2C type customers. Need to handle heavy traffic, be low cost, and scale well.
We are open to building it on either AWS or Microsoft Azure.
Apologies if I'm leaving out some info. My first post. :) Thanks in advance!
I recommend this:
- Spring Reactive for the back end: because it's reactive (async), it consumes about half the resources a sync platform needs (so less CPU -> less money).
- Angular for the web front end: it gives you the option of a PWA, which is a cheap replacement for a mobile app (though far less popular).
- Docker images.
- Kubernetes to orchestrate all the containers.
- I use Jenkins / Blue Ocean and Ansible for my CI/CD (with GitHub, of course).
- AWS, of course: you can run a K8s cluster there, make it multi-AZ (availability zones) to be highly available, add a load balancer and an auto scaler, and you're good to go.
- For storage, you can take any managed DB, or you can deploy your own (cheap but risky).
You pay less money, but you need 2-3 technical people to make that happen.
Good luck
My advice:
- Front end: React.
- Back end language: Java or Kotlin.
- Database: SQL: Postgres, MySQL, or Aurora; NoSQL: MongoDB.
- Caching: Redis.
- Public API: Spring WebFlux for async public-facing operations.
- Admin API: Spring Boot, Hibernate, REST API.
- Build container images. Kubernetes: AWS EKS, AWS ECS, or Google GKE.
- Use Jenkins for the CI/CD pipeline; Buddy also works well for AWS.
- Static content: host on an AWS S3 bucket, and use CloudFront or Cloudflare as the CDN.
Serverless solution: API Gateway + Lambda, Serverless Aurora (SQL), and an AWS S3 bucket.
I am so confused. I need a tool that will allow me to go to about 10 different URLs to get a list of objects. Those object lists will be hundreds or thousands in length. I then need to get detailed data lists about each object. Those detailed data lists can have hundreds of elements that could be map/reduced somehow. My batch process dies sometimes halfway through, which means hours of processing gone, i.e. time wasted.

I need something like a directed graph that will keep the results of successful data collection and allow me, either programmatically or manually, to retry the failed ones some number of times (0 - forever). I want it to then process all the ones that have succeeded or been effectively ignored and load the data store with the aggregation of some couple thousand data points. I know hitting this many endpoints is not a good practice, but I can't put collectors on all the endpoints or anything like that. It is pretty much the only way to get the data.
For a non-streaming approach:
You could consider using more checkpoints throughout your Spark jobs. Furthermore, you could separate your workload into multiple jobs with an intermediate data store (Cassandra is one suggestion; choose based on your preference and availability) to store results, perform aggregations, and store the results of those (see the sketch below).
- Spark Job 1: fetch data from the 10 URLs and store the data and metadata in a data store (Cassandra).
- Spark Jobs 2..n: check the data store for unprocessed items and continue the aggregation.
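A minimal PySpark sketch of that split, using Parquet as the intermediate store for illustration (swap in Cassandra via the spark-cassandra-connector if you prefer); the URLs and paths are hypothetical:

```python
# Job 1: fetch each URL once, persist raw bodies plus a status column so
# failures can be retried later without redoing successful work.
# Assumes the `requests` library is installed on the workers.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fetch-objects").getOrCreate()

URLS = [f"https://example.com/api/objects?page={i}" for i in range(10)]  # placeholders

def fetch(url):
    try:
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return (url, resp.text, "ok")
    except Exception as exc:
        return (url, None, f"failed: {exc}")

raw = spark.sparkContext.parallelize(URLS).map(fetch)
raw.toDF(["url", "body", "status"]).write.mode("overwrite").parquet("/data/raw_objects")

# Jobs 2..n: re-read the store, retry only the failures, aggregate the rest.
stored = spark.read.parquet("/data/raw_objects")
failed = stored.where(stored.status != "ok")
# ...retry the failed rows (appending successes), then aggregate the "ok" rows.
```

Because each job restarts from the data store rather than from the network, a crash halfway through no longer throws away hours of fetching.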
Alternatively, for a streaming approach: treating your data as a stream might also be useful. Spark Streaming allows you to utilize a checkpoint interval - https://spark.apache.org/docs/latest/streaming-programming-guide.html#checkpointing
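Following that guide, here is a minimal checkpointing sketch; the checkpoint directory, socket source, and batch interval are placeholders:

```python
# A Spark Streaming (DStream) checkpointing sketch. On restart,
# getOrCreate() rebuilds the context from the checkpoint directory
# instead of reprocessing completed batches.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

CHECKPOINT_DIR = "/data/checkpoints"  # should live on fault-tolerant storage (HDFS/S3)

def create_context():
    sc = SparkContext(appName="checkpointed-stream")
    ssc = StreamingContext(sc, batchDuration=10)     # 10-second batches
    ssc.checkpoint(CHECKPOINT_DIR)                   # enable metadata + data checkpointing
    lines = ssc.socketTextStream("localhost", 9999)  # placeholder source
    lines.countByValue().pprint()
    return ssc

ssc = StreamingContext.getOrCreate(CHECKPOINT_DIR, create_context)
ssc.start()
ssc.awaitTermination()
```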
When adding a new feature to Checkly or rearchitecting some older piece, I tend to pick Heroku for rolling it out. But not always, because sometimes I pick AWS Lambda. The short story:
- Developer Experience trumps everything.
- AWS Lambda is cheap. Up to a limit though. This impacts more than just your wallet.
- If you need geographic spread, AWS is lonely at the top.
Recently, I was doing a brainstorm at a startup here in Berlin on the future of their infrastructure. They were ready to move on from their initial, almost 100% EC2 + Chef based setup. Everything was on the table. But we crossed out a lot quite quickly:
- Pure, uncut, self-hosted Kubernetes — way too much complexity
- Managed Kubernetes in various flavors — still too much complexity
- Zeit — Maybe, but no Docker support
- Elastic Beanstalk — Maybe, a bit old but does the job
- Heroku
- Lambda
It became clear a mix of PaaS and FaaS was the way to go. What a surprise! That is exactly what I use for Checkly! But when do you pick which model?
I chopped that question up into the following categories:
- Developer Experience / DX 🤓
- Ops Experience / OX 🐂 (?)
- Cost 💵
- Lock in 🔐
Read the full post linked below for all the details.