Merge branch 'apache:dev' into increase_e2e_disk_space
SbloodyS authored Jun 24, 2024
2 parents 112e418 + b94212b commit 2e0da5d
Showing 190 changed files with 1,639 additions and 18,321 deletions.
53 changes: 53 additions & 0 deletions .github/workflows/owasp-dependency-check.yaml
@@ -0,0 +1,53 @@
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

name: OWASP Dependency Check

on:
  push:
    branches:
      - dev
  pull_request:
    paths:
      - '**/pom.xml'

env:
  MAVEN_OPTS: -Dmaven.wagon.httpconnectionManager.ttlSeconds=25 -Dmaven.wagon.http.retryHandler.count=3

jobs:
  build:
    runs-on: ubuntu-latest
    timeout-minutes: 120
    steps:
      - uses: actions/checkout@v4
        with:
          submodules: true
      - name: Set up JDK 8
        uses: actions/setup-java@v4
        with:
          java-version: 8
          distribution: 'adopt'
      - name: Run OWASP Dependency Check
        run: ./mvnw -B clean install dependency-check:check -DskipDependencyCheck=false -Dspotless.skip=true
      - name: Upload report
        uses: actions/upload-artifact@v4
        if: ${{ cancelled() || failure() }}
        continue-on-error: true
        with:
          name: dependency report
          path: target/dependency-check-report.html
          retention-days: 3
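Because the scan step fails the build when a qualifying vulnerability is found (see the `failBuildOnCVSS` setting in the root pom.xml below), the report artifact is uploaded only in the failure or cancelled case. The same report can be produced locally with the command from the run step, `./mvnw clean install dependency-check:check -DskipDependencyCheck=false -Dspotless.skip=true`; it is written to `target/dependency-check-report.html`.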
26 changes: 13 additions & 13 deletions .idea/vcs.xml

Some generated files are not rendered by default.

10 changes: 4 additions & 6 deletions README.md
@@ -43,9 +43,9 @@
## 🚀 Abstract

----
<h4>Apache StreamPark is a stream processing development framework and professional management platform. </h4>
<h4>Apache StreamPark is a stream processing development framework and application management platform. </h4>

> Apache StreamPark is a streaming application development framework. Aimed at easing the building and management of streaming applications, it provides a development framework for writing stream processing applications with Apache Flink and Apache Spark; more engines will be supported in the future. StreamPark is also a professional management platform for streaming applications, covering application development, debugging, interactive query, deployment, operation, and maintenance. It was initially known as StreamX and was renamed StreamPark in August 2022.
* Apache Flink & Apache Spark application development scaffold
* Support multiple versions of Apache Flink & Apache Spark
@@ -54,8 +54,6 @@
* Support catalog, OLAP, streaming-warehouse, etc.
* ...

![](https://streampark.apache.org/image/sqlide.png)

## 🚀 QuickStart

- [Start with Docker](docker/README.md)
@@ -79,7 +77,7 @@ Download address for run-directly software package: https://streampark.apache.or

## 💋 Our users

Various companies and organizations use StreamPark for research, production and commercial products. Are you using this project? [Welcome to add your company](https://github.com/apache/incubator-streampark/issues/163)!
Various companies and organizations use Apache StreamPark for research, production and commercial products. Are you using this project? [Welcome to add your company](https://github.com/apache/incubator-streampark/issues/163)!

![Our users](https://streampark.apache.org/image/users.png)

@@ -100,7 +98,7 @@ We welcome your suggestions, comments (including criticisms), comments and contr
### 📤 Subscribe Mailing Lists
Mail List is the most recognized form of communication in Apache community. See how to [Join the Mailing Lists](https://streampark.apache.org/community/contribution_guide/mailing_lists)

Thank you to all the people who already contributed to StreamPark!
Thank you to all the people who already contributed to Apache StreamPark!

[![contrib graph](https://contrib.rocks/image?repo=apache/streampark)](https://github.com/apache/incubator-streampark/graphs/contributors)

7 changes: 0 additions & 7 deletions dist-material/release-docs/LICENSE
@@ -577,10 +577,6 @@ The text of each license is the standard Apache 2.0 license. https://www.apache.
https://mvnrepository.com/artifact/org.pac4j/pac4j-oauth/4.5.7 Apache-2.0
https://mvnrepository.com/artifact/org.pac4j/pac4j-oidc/4.5.7 Apache-2.0
https://mvnrepository.com/artifact/io.fabric8/kubernetes-client/6.8.0 Apache-2.0
https://mvnrepository.com/artifact/dev.zio/zio_2.12/2.0.15 Apache-2.0
https://mvnrepository.com/artifact/dev.zio/zio-streams_2.12/2.0.15 Apache-2.0
https://mvnrepository.com/artifact/dev.zio/zio-concurrent_2.12/2.0.15 Apache-2.0
https://mvnrepository.com/artifact/dev.zio/zio-http_2.12/3.0.0-RC2 Apache-2.0
https://maven.apache.org/wrapper Apache-2.0
mvnw files from https://github.com/apache/maven-wrapper Apache 2.0
streampark-console/streampark-console-service/src/main/assembly/bin/setclasspath.sh from https://github.com/apache/tomcat
@@ -727,9 +723,6 @@ The text of each license is also included in licenses/LICENSE-[project].txt.
https://mvnrepository.com/artifact/org.slf4j/slf4j-api/1.7.30 MIT
https://mvnrepository.com/artifact/org.projectlombok/lombok/1.18.24 MIT
https://mvnrepository.com/artifact/com.auth0/java-jwt/4.0.0 MIT
https://mvnrepository.com/artifact/com.lihaoyi/pprint_2.12/0.8.1 MIT
https://mvnrepository.com/artifact/com.lihaoyi/os-lib_2.12/0.8.1 MIT
https://mvnrepository.com/artifact/com.lihaoyi/upickle_2.12/0.8.1 MIT
https://mvnrepository.com/artifact/com.github.scribejava/scribejava-apis/7.1.1 MIT
https://mvnrepository.com/artifact/com.github.scribejava/scribejava-core/7.1.1 MIT

34 changes: 3 additions & 31 deletions pom.xml
@@ -123,8 +123,6 @@
<commons-lang3.version>3.8.1</commons-lang3.version>
<enumeratum.version>1.6.1</enumeratum.version>
<assertj.version>3.23.1</assertj.version>
<zio.version>2.0.15</zio.version>
<zio-logging.version>2.1.13</zio-logging.version>
<pprint.version>0.8.1</pprint.version>

<maven-compiler-plugin.version>3.10.1</maven-compiler-plugin.version>
@@ -138,7 +136,7 @@
<maven-apache-rat-plugin.version>0.13</maven-apache-rat-plugin.version>
<spotless.scalafmt.version>3.4.3</spotless.scalafmt.version>
<maven-checkstyle-plugin.version>3.2.0</maven-checkstyle-plugin.version>
<owasp-dependency-check-maven.version>8.2.1</owasp-dependency-check-maven.version>
<owasp-dependency-check-maven.version>9.2.0</owasp-dependency-check-maven.version>
<build-helper-maven-plugin.version>3.3.0</build-helper-maven-plugin.version>
<streampark.shaded.package>org.apache.streampark.shaded</streampark.shaded.package>
<flink.table.uber.artifact.id>flink-table-uber_${scala.binary.version}</flink.table.uber.artifact.id>
@@ -443,33 +441,6 @@
<version>${assertj.version}</version>
<scope>test</scope>
</dependency>

<!-- ZIO -->
<dependency>
<groupId>dev.zio</groupId>
<artifactId>zio-logging_${scala.binary.version}</artifactId>
<version>${zio-logging.version}</version>
</dependency>

<dependency>
<groupId>dev.zio</groupId>
<artifactId>zio-streams_${scala.binary.version}</artifactId>
<version>${zio.version}</version>
</dependency>

<dependency>
<groupId>dev.zio</groupId>
<artifactId>zio-concurrent_${scala.binary.version}</artifactId>
<version>${zio.version}</version>
</dependency>

<!-- pprint -->
<dependency>
<groupId>com.lihaoyi</groupId>
<artifactId>pprint_${scala.binary.version}</artifactId>
<version>${pprint.version}</version>
</dependency>

</dependencies>

</dependencyManagement>
@@ -783,9 +754,10 @@
<version>${owasp-dependency-check-maven.version}</version>
<configuration>
<skip>${skipDependencyCheck}</skip>
<format>ALL</format>
<skipProvidedScope>true</skipProvidedScope>
<skipRuntimeScope>true</skipRuntimeScope>
<skipSystemScope>true</skipSystemScope>
<failBuildOnCVSS>7</failBuildOnCVSS>
</configuration>
<executions>
<execution>
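As configured, the check skips dependencies in the provided, runtime, and system scopes and fails the build when any analyzed dependency carries a known vulnerability with a CVSS score of 7 or higher (High severity). The `skipDependencyCheck` property keeps the plugin off in ordinary builds; the workflow above enables it explicitly with `-DskipDependencyCheck=false`.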
29 changes: 0 additions & 29 deletions streampark-common/pom.xml
@@ -142,34 +142,6 @@
<artifactId>streampark-shaded-jackson</artifactId>
<version>${streampark.shaded.version}</version>
</dependency>


<!-- ZIO -->
<dependency>
<groupId>dev.zio</groupId>
<artifactId>zio-logging_${scala.binary.version}</artifactId>
<optional>true</optional>
</dependency>

<dependency>
<groupId>dev.zio</groupId>
<artifactId>zio-streams_${scala.binary.version}</artifactId>
<optional>true</optional>
</dependency>

<dependency>
<groupId>dev.zio</groupId>
<artifactId>zio-concurrent_${scala.binary.version}</artifactId>
<optional>true</optional>
</dependency>

<!-- pprint -->
<dependency>
<groupId>com.lihaoyi</groupId>
<artifactId>pprint_${scala.binary.version}</artifactId>
<optional>true</optional>
</dependency>

</dependencies>

<build>
@@ -200,7 +172,6 @@
<artifactSet>
<includes>
<include>com.beachape:*</include>
<include>com.lihaoyi:*</include>
<include>io.netty:netty-resolver</include>
</includes>
</artifactSet>
K8sFlinkConfig.scala
@@ -20,18 +20,7 @@ package org.apache.streampark.common.conf
/** Flink kubernetes Configuration for v1 version */
object K8sFlinkConfig {

lazy val isV2Enabled: Boolean = InternalConfigHolder.get(ENABLE_V2)

val ENABLE_V2: InternalOption = InternalOption(
key = "streampark.flink-k8s.enable-v2",
defaultValue = false,
classType = classOf[java.lang.Boolean],
description =
"Whether to enable the v2 version(base on flink-kubernetes-operator) of flink kubernetes operation"
)

// ======= deprecated =======

@Deprecated
val jobStatusTrackTaskTimeoutSec: InternalOption = InternalOption(
key = "streampark.flink-k8s.tracking.polling-task-timeout-sec.job-status",
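For context, the removed `isV2Enabled` accessor shows how these options are read: `InternalConfigHolder.get` resolves an `InternalOption` to its configured or default value. A hypothetical sketch of reading the surviving deprecated option, assuming its value type is `Long` as the `-sec` key suffix suggests:

// Hypothetical, not from the source: read the deprecated polling timeout
// the same way the removed isV2Enabled accessor read ENABLE_V2.
val timeoutSec: Long =
  InternalConfigHolder.get(K8sFlinkConfig.jobStatusTrackTaskTimeoutSec)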
DateUtils.scala
Expand Up @@ -17,7 +17,7 @@
package org.apache.streampark.common.util

import java.text.{ParseException, SimpleDateFormat}
import java.time.{Duration, LocalDateTime}
import java.time.{Duration, LocalDateTime, ZoneId}
import java.time.format.DateTimeFormatter
import java.util._
import java.util.concurrent.TimeUnit
@@ -173,6 +173,13 @@
builder.toString
}

/** Returns the whole-second gap between two dates; time2 defaults to now. */
def toSecondDuration(time1: Date, time2: Date = new Date()): Long = {
  val startDateTime = LocalDateTime.ofInstant(time1.toInstant, ZoneId.systemDefault())
  val endDateTime = LocalDateTime.ofInstant(time2.toInstant, ZoneId.systemDefault())
  val duration = Duration.between(startDateTime, endDateTime)
  duration.toMillis / 1000
}

def getTimeUnit(
time: String,
default: (Int, TimeUnit) = (5, TimeUnit.SECONDS)): (Int, TimeUnit) = {
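Since `Duration.between` also accepts `Instant` values directly, the `LocalDateTime`/`ZoneId` round-trip above is not strictly required, though it is behavior-preserving. A minimal usage sketch of the new helper (names and values below are illustrative, not from the source):

// Illustrative only: how long ago did a task start, in whole seconds?
val started = new java.util.Date(System.currentTimeMillis() - 90 * 1000L)
val elapsed: Long = DateUtils.toSecondDuration(started) // time2 defaults to new Date()
// elapsed is ~90; toMillis / 1000 truncates fractional seconds rather than rounding.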
HadoopUtils.scala
@@ -39,7 +39,7 @@ import java.util
import java.util.{Timer, TimerTask}
import java.util.concurrent._

import scala.collection.JavaConversions._
import scala.collection.convert.ImplicitConversions._
import scala.util.{Failure, Success, Try}

object HadoopUtils extends Logger {
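The import swap here replaces the long-deprecated `scala.collection.JavaConversions._` wildcard with `scala.collection.convert.ImplicitConversions._`, the standard library's non-deprecated home for the same implicit Java/Scala collection conversions. A minimal sketch of the pattern (values are illustrative):

import scala.collection.convert.ImplicitConversions._

val files = new java.util.ArrayList[String]()
files.add("core-site.xml")
files.add("hdfs-site.xml")
// The implicit view lets Scala collection methods apply to the Java list:
files.foreach(println)
val first = files.headOption // Some("core-site.xml")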
PropertiesUtils.scala
@@ -27,7 +26,7 @@ import java.util.{HashMap => JavaMap, LinkedHashMap => JavaLinkedMap, Properties
import java.util.concurrent.atomic.AtomicInteger
import java.util.regex.Pattern

import scala.collection.JavaConverters._
import scala.collection.convert.ImplicitConversions._
import scala.collection.mutable
import scala.collection.mutable.{ArrayBuffer, Map => MutableMap}
@@ -203,31 +202,31 @@
}

def fromYamlTextAsJava(text: String): JavaMap[String, String] =
new JavaMap[String, String](fromYamlText(text).asJava)
new JavaMap[String, String](fromYamlText(text))

def fromHoconTextAsJava(text: String): JavaMap[String, String] =
new JavaMap[String, String](fromHoconText(text).asJava)
new JavaMap[String, String](fromHoconText(text))

def fromPropertiesTextAsJava(text: String): JavaMap[String, String] =
new JavaMap[String, String](fromPropertiesText(text).asJava)
new JavaMap[String, String](fromPropertiesText(text))

def fromYamlFileAsJava(filename: String): JavaMap[String, String] =
new JavaMap[String, String](fromYamlFile(filename).asJava)
new JavaMap[String, String](fromYamlFile(filename))

def fromHoconFileAsJava(filename: String): JavaMap[String, String] =
new JavaMap[String, String](fromHoconFile(filename).asJava)
new JavaMap[String, String](fromHoconFile(filename))

def fromPropertiesFileAsJava(filename: String): JavaMap[String, String] =
new JavaMap[String, String](fromPropertiesFile(filename).asJava)
new JavaMap[String, String](fromPropertiesFile(filename))

def fromYamlFileAsJava(inputStream: InputStream): JavaMap[String, String] =
new JavaMap[String, String](fromYamlFile(inputStream).asJava)
new JavaMap[String, String](fromYamlFile(inputStream))

def fromHoconFileAsJava(inputStream: InputStream): JavaMap[String, String] =
new JavaMap[String, String](fromHoconFile(inputStream).asJava)
new JavaMap[String, String](fromHoconFile(inputStream))

def fromPropertiesFileAsJava(inputStream: InputStream): JavaMap[String, String] =
new JavaMap[String, String](fromPropertiesFile(inputStream).asJava)
new JavaMap[String, String](fromPropertiesFile(inputStream))

/**
* @param file
@@ -370,13 +369,13 @@
}

@Nonnull def extractDynamicPropertiesAsJava(properties: String): JavaMap[String, String] =
new JavaMap[String, String](extractDynamicProperties(properties).asJava)
new JavaMap[String, String](extractDynamicProperties(properties))

@Nonnull def extractMultipleArgumentsAsJava(
args: Array[String]): JavaMap[String, JavaMap[String, String]] = {
val map =
extractMultipleArguments(args).map(c => c._1 -> new JavaMap[String, String](c._2.asJava))
new JavaMap[String, JavaMap[String, String]](map.asJava)
extractMultipleArguments(args).map(c => c._1 -> new JavaMap[String, String](c._2))
new JavaMap[String, JavaMap[String, String]](map)
}

}
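These call sites can drop `.asJava` because the file now imports `scala.collection.convert.ImplicitConversions._` (see the import hunk above), which supplies an implicit view from Scala maps to `java.util.Map`, satisfying `JavaMap`'s copy constructor. A sketch of the mechanism under that assumption:

import java.util.{HashMap => JavaMap}
import scala.collection.convert.ImplicitConversions._

val scalaMap: Map[String, String] = Map("flink.version" -> "1.17")
// Implicitly converted to java.util.Map[String, String], which matches
// java.util.HashMap's copy constructor, so no explicit .asJava is needed:
val javaMap = new JavaMap[String, String](scalaMap)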
(The remaining changed files in this commit are not rendered here.)

