How to run Apache Flink locally?
A step-by-step guide to installing Apache Flink locally
For true real-time stream processing (not micro-batching), Apache Flink is the next big thing. The documentation defines Apache Flink as:
Apache Flink is a framework for stateful computations over unbounded and bounded data streams.
Follow along to run Apache Flink locally.
Step 1: Download Apache Flink
- From the official website of Apache Flink, download the requisite binary. If you want the latest version, download either Apache Flink x.x.x for Scala 2.11 or Apache Flink x.x.x for Scala 2.12, depending on your Scala version requirements. As of August 30, 2020, Apache Flink 1.11.1 is the latest version.
- If you need a specific version of Apache Flink (for example, Kinesis Data Analytics requires Apache Flink 1.8.2), then on the Downloads page, click the release you want under Stable Releases.
For the version you want to install, click on Binaries.
Then, as above, download the binary that matches your Scala version (a command-line sketch follows).
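If you prefer the terminal, here is a minimal download sketch. It assumes the 1.11.1 release for Scala 2.11 and pulls from the Apache archive; adjust the version and Scala suffix in the URL to match your choice above.
# download the Flink 1.11.1 binary for Scala 2.11 from the Apache archive
wget https://archive.apache.org/dist/flink/flink-1.11.1/flink-1.11.1-bin-scala_2.11.tgz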
Step 2: Unzip the Downloaded Binary
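The download is a .tgz archive. On Linux or macOS you can extract it from the terminal; the file name below assumes the 1.11.1 / Scala 2.11 binary from Step 1. On Windows, any archive tool that handles .tgz works.
# extract the archive; this creates a flink-1.11.1 directory
tar -xzf flink-1.11.1-bin-scala_2.11.tgz
cd flink-1.11.1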
Step 3: Start Apache Flink Locally
- Go to the bin directory.
- For Windows, run start-cluster.bat (Windows Batch File).
- For Linux, run the start-cluster.sh shell script:
bin/start-cluster.sh
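When you are done, the distribution ships a matching script to shut the local cluster down again:
bin/stop-cluster.sh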
Step 4: Connect to Web UI
If all goes well, you should see the Flink dashboard at http://localhost:8081.
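Beyond opening the dashboard, a quick way to confirm the cluster actually accepts work is to submit one of the example jobs bundled with the distribution (run from the extracted Flink directory):
# submit the bundled streaming WordCount example to the local cluster
./bin/flink run examples/streaming/WordCount.jar
The finished job should then show up under Completed Jobs in the dashboard.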
Voilà!! Apache Flink is up and running.
Notes:
- Prerequisites: Java and Scala installed (a quick terminal check follows below).
- If you want to work with Apache Flink in Java, download the requisite binary accordingly; all the other steps remain the same.
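A minimal sanity check for the prerequisites from the terminal:
# verify a Java runtime is installed and on the PATH
java -version
# verify Scala is installed
scala -version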
Until next time,
Ciao.