[ZEPPELIN-5978] Remove support for old Flink 1.13 and 1.14 (#4688)
* [ZEPPELIN-5978] Remove support for old Flink 1.13 and 1.14
pan3793 authored Nov 9, 2023
1 parent 4352a10 commit 9dadde3
Showing 25 changed files with 158 additions and 3,242 deletions.
18 changes: 4 additions & 14 deletions .github/workflows/core.yml
@@ -216,7 +216,7 @@ jobs:
${{ runner.os }}-zeppelin-
- name: install environment
run: |
./mvnw install -DskipTests -Phadoop2 -Pintegration -pl zeppelin-interpreter-integration,zeppelin-web,spark-submit,spark/scala-2.12,spark/scala-2.13,markdown,flink-cmd,flink/flink-scala-2.11,flink/flink-scala-2.12,jdbc,shell -am -Pflink-114 ${MAVEN_ARGS}
./mvnw install -DskipTests -Phadoop2 -Pintegration -pl zeppelin-interpreter-integration,zeppelin-web,spark-submit,spark/scala-2.12,spark/scala-2.13,markdown,flink-cmd,flink/flink-scala-2.12,jdbc,shell -am -Pflink-117 ${MAVEN_ARGS}
./mvnw package -pl zeppelin-plugins -amd -DskipTests ${MAVEN_ARGS}
- name: Setup conda environment with python 3.7 and R
uses: conda-incubator/setup-miniconda@v2
@@ -243,7 +243,7 @@ jobs:
strategy:
fail-fast: false
matrix:
flink: [113, 114, 115, 116, 117]
flink: [115, 116, 117]
steps:
- name: Checkout
uses: actions/checkout@v3
@@ -265,13 +265,7 @@ jobs:
key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
restore-keys: |
${{ runner.os }}-zeppelin-
- name: install environment for flink before 1.15 (exclusive)
if: matrix.flink < '115'
run: |
./mvnw install -DskipTests -am -pl flink/flink-scala-2.11,flink/flink-scala-2.12,flink-cmd,zeppelin-interpreter-integration -Pflink-${{ matrix.flink }} -Phadoop2 -Pintegration ${MAVEN_ARGS}
./mvnw clean package -pl zeppelin-plugins -amd -DskipTests ${MAVEN_ARGS}
- name: install environment for flink after 1.15 (inclusive)
if: matrix.flink >= '115'
- name: install environment for flink
run: |
./mvnw install -DskipTests -am -pl flink/flink-scala-2.12,flink-cmd,zeppelin-interpreter-integration -Pflink-${{ matrix.flink }} -Phadoop2 -Pintegration ${MAVEN_ARGS}
./mvnw clean package -pl zeppelin-plugins -amd -DskipTests ${MAVEN_ARGS}
@@ -286,11 +280,7 @@ jobs:
channel-priority: true
auto-activate-base: false
use-mamba: true
- name: run tests for flink before 1.15 (exclusive)
if: matrix.flink < '115'
run: ./mvnw verify -pl flink/flink-scala-2.11,flink/flink-scala-2.12,flink-cmd,zeppelin-interpreter-integration -Pflink-${{ matrix.flink }} -Phadoop2 -Pintegration -DfailIfNoTests=false -Dtest=org.apache.zeppelin.flink.*Test,FlinkIntegrationTest${{ matrix.flink }} ${MAVEN_ARGS}
- name: run tests for flink after 1.15 (inclusive)
if: matrix.flink >= '115'
- name: run tests for flink
run: ./mvnw verify -pl flink/flink-scala-2.12,flink-cmd,zeppelin-interpreter-integration -Pflink-${{ matrix.flink }} -am -Phadoop2 -Pintegration -DfailIfNoTests=false -Dtest=org.apache.zeppelin.flink.*Test,FlinkIntegrationTest${{ matrix.flink }} ${MAVEN_ARGS}
- name: Print zeppelin logs
if: always()
7 changes: 7 additions & 0 deletions .gitignore
@@ -67,6 +67,7 @@ zeppelin-web/yarn.lock
/warehouse/
/notebook/
/local-repo/
/notebook_*/

**/sessions/
**/data/
@@ -97,6 +98,9 @@ Thumbs.db
.idea/
*.iml

# Jetbrains Fleet project files
.fleet/

# vscode project files
.vscode/

@@ -132,3 +136,6 @@ tramp

# jEnv file
.java-version

# pyenv file
.python-version
12 changes: 4 additions & 8 deletions docs/interpreter/flink.md
@@ -27,7 +27,7 @@ limitations under the License.
[Apache Flink](https://flink.apache.org) is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams.
Flink has been designed to run in all common cluster environments, perform computations at in-memory speed and at any scale.

In Zeppelin 0.9, we refactor the Flink interpreter in Zeppelin to support the latest version of Flink. **Only Flink 1.10+ is supported, old versions of flink won't work.**
In Zeppelin 0.9, we refactored the Flink interpreter to support the latest versions of Flink. **Currently, only Flink 1.15+ is supported; older versions of Flink won't work.**
Apache Flink is supported in Zeppelin with the Flink interpreter group which consists of the five interpreters listed below.

<table class="table-configuration">
@@ -74,10 +74,6 @@ Apache Flink is supported in Zeppelin with the Flink interpreter group which con
<td>Support multiple versions of Flink</td>
<td>You can run different versions of Flink in one Zeppelin instance</td>
</tr>
<tr>
<td>Support multiple versions of Scala</td>
<td>You can run different Scala versions of Flink in one Zeppelin instance</td>
</tr>
<tr>
<td>Support multiple languages</td>
<td>Scala, Python, SQL are supported, besides that you can also collaborate across languages, e.g. you can write Scala UDF and use it in PyFlink</td>
@@ -142,11 +138,11 @@ docker run -u $(id -u) -p 8080:8080 --rm -v /mnt/disk1/flink-sql-cookbook-on-zep

## Prerequisites

Download Flink 1.10 or afterwards (Scala 2.11 & 2.12 are both supported)
Download Flink 1.15 or later (only Scala 2.12 is supported)

### Specific for Flink 1.15 and above
### Version-specific notes for Flink

Flink 1.15 is scala free and has changed its binary distribution. If you would like to make Zeppelin work with Flink 1.15, you need to do the following extra steps.
Flink 1.15 is Scala-free and has changed its binary distribution, so the following extra steps are required.
* Move FLINK_HOME/opt/flink-table-planner_2.12-1.15.0.jar to FLINK_HOME/lib
* Move FLINK_HOME/lib/flink-table-planner-loader-1.15.0.jar to FLINK_HOME/opt
* Download flink-table-api-scala-bridge_2.12-1.15.0.jar and flink-table-api-scala_2.12-1.15.0.jar to FLINK_HOME/lib
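The jar relocation above can be sketched as a small shell script. This is an illustrative demo only: it builds a throwaway mock `FLINK_HOME` with empty placeholder files so the moves can be shown end to end; against a real installation you would point `FLINK_HOME` at your actual Flink directory and skip the `mkdir`/`touch` lines.

```shell
# Demo of the Flink 1.15 jar relocation using a throwaway mock FLINK_HOME.
# The mkdir/touch lines only simulate a fresh Flink binary distribution.
FLINK_HOME="$(mktemp -d)"
mkdir -p "$FLINK_HOME/lib" "$FLINK_HOME/opt"
touch "$FLINK_HOME/opt/flink-table-planner_2.12-1.15.0.jar"
touch "$FLINK_HOME/lib/flink-table-planner-loader-1.15.0.jar"

# Step 1: the Scala planner must be on the classpath, so move it into lib/.
mv "$FLINK_HOME/opt/flink-table-planner_2.12-1.15.0.jar" "$FLINK_HOME/lib/"
# Step 2: the planner loader must get out of the way, so move it into opt/.
mv "$FLINK_HOME/lib/flink-table-planner-loader-1.15.0.jar" "$FLINK_HOME/opt/"

ls "$FLINK_HOME/lib"
```

(Step 3, downloading the two `flink-table-api-scala` jars into `FLINK_HOME/lib`, is omitted here since it needs network access.)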
33 changes: 7 additions & 26 deletions flink/README.md
@@ -8,45 +8,26 @@ This is the doc for Zeppelin developers who want to work on flink interpreter.

The Flink interpreter is more complex than other interpreters (such as jdbc, shell). Currently it has the following 6 modules
* flink-shims
* flink1.13-shims
* flink1.14-shims
* flink1.15-shims
* flink1.16-shims
* flink1.17-shims
* flink-scala-parent
* flink-scala-2.11
* flink-scala-2.12

The first 5 modules are to adapt different flink versions because there're some api changes between different versions of flink.
The first 4 modules adapt to different Flink versions, because there are some API changes between Flink versions.
`flink-shims` is parent module for other shims modules.
At runtime Flink interpreter will load the FlinkShims based on the current flink versions (See `FlinkShims#loadShims`).
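The runtime dispatch can be illustrated with a tiny shell sketch. The function name and mapping below are hypothetical, written only to mirror the module list above; the real logic is Java code in `FlinkShims#loadShims`, which this does not reproduce.

```shell
# Illustrative only: map a full Flink version string to the shims module
# Zeppelin would load. Mirrors the module list above, not the real Java code.
shims_for_flink_version() {
  # Reduce e.g. "1.16.2" to "1.16" before matching.
  major_minor="$(echo "$1" | cut -d. -f1-2)"
  case "$major_minor" in
    1.15) echo "flink1.15-shims" ;;
    1.16) echo "flink1.16-shims" ;;
    1.17) echo "flink1.17-shims" ;;
    *)    echo "unsupported" ;;      # e.g. 1.13/1.14 after this commit
  esac
}

shims_for_flink_version "1.16.2"   # -> flink1.16-shims
```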

The remaining 3 modules are to adapt different scala versions (Apache Flink supports 2 scala versions: 2.11 & 2.12).
`flink-scala-parent` is a parent module for `flink-scala-2.11` and `flink-scala-2.12`. It contains common code for both `flink-scala-2.11` and `flink-scala-2.12`.
There's symlink folder `flink-scala-parent` under `flink-scala-2.11` and `flink-scala-2.12`.
The remaining 2 modules handle the Scala version (Apache Flink now only supports Scala 2.12).
`flink-scala-parent` is the parent module of `flink-scala-2.12` and contains the common code.
There's a symlink folder `flink-scala-parent` under `flink-scala-2.12`.
When you run maven command to build flink interpreter, the source code in `flink-scala-parent` won't be compiled directly, instead
they will be compiled against different scala versions when building `flink-scala-2.11` & `flink-scala-2.12`. (See `build-helper-maven-plugin` in `pom.xml`)
Both `flink-scala-2.11` and `flink-scala-2.12` build a flink interpreter jar and `FlinkInterpreterLauncher` in `zeppelin-plugins/launcher/flink` will choose the right jar based
it will be compiled against Scala 2.12 when building `flink-scala-2.12`. (See `build-helper-maven-plugin` in `pom.xml`)
`flink-scala-2.12` builds a flink interpreter jar, and `FlinkInterpreterLauncher` in `zeppelin-plugins/launcher/flink` will choose the right jar based
on the scala version of flink.

### Work in IDE

Because of the complex project structure of the flink interpreter, we need extra configuration to make it work in an IDE.
Here we take IntelliJ as an example (other IDEs should be similar).

The key point is that we can only make flink interpreter work with one scala version at the same time in IDE.
So we have to disable the other module when working with one specific scala version module.

#### Make it work with scala-2.11

1. Exclude the source code folder (java/scala) of `flink-scala-parent` (Right click these folder -> Mark directory As -> Excluded)
2. Include the source code folder (java/scala) of `flink/flink-scala-2.11/flink-scala-parent` (Right click these folder -> Mark directory As -> Source root)

#### Make it work with scala-2.12

1. Exclude the source code folder (java/scala) of `flink-scala-parent` (Right click these folder -> Mark directory As -> Excluded)
2. Include the source code folder (java/scala) of `flink/flink-scala-2.12/flink-scala-parent` (Right click these folder -> Mark directory As -> Source root)


#### How to run unit test in IDE

Take `FlinkInterpreterTest` as an example: you need to specify the environment variables `FLINK_HOME`, `FLINK_CONF_DIR`, and `ZEPPELIN_HOME`.
1 change: 0 additions & 1 deletion flink/flink-scala-2.11/flink-scala-parent

This file was deleted.

83 changes: 0 additions & 83 deletions flink/flink-scala-2.11/pom.xml

This file was deleted.

This file was deleted.

