diff --git a/.gitignore b/.gitignore
index ec394ab09063a..ff9e964289f37 100644
--- a/.gitignore
+++ b/.gitignore
@@ -5,6 +5,13 @@
 *.ipr
 *.iws
 build-idea/
+out/
+
+# include shared intellij config
+!.idea/scopes/x_pack.xml
+!.idea/inspectionProfiles/Project_Default.xml
+!.idea/runConfigurations/Debug_Elasticsearch.xml
+
 # These files are generated in the main tree by IntelliJ
 benchmarks/src/main/generated/*
diff --git a/.idea/inspectionProfiles/Project_Default.xml b/.idea/inspectionProfiles/Project_Default.xml
new file mode 100644
index 0000000000000..5cf789707c58c
--- /dev/null
+++ b/.idea/inspectionProfiles/Project_Default.xml
@@ -0,0 +1,9 @@
+
+
+
\ No newline at end of file
diff --git a/.idea/runConfigurations/Debug_Elasticsearch.xml b/.idea/runConfigurations/Debug_Elasticsearch.xml
new file mode 100644
index 0000000000000..685c59c1846a4
--- /dev/null
+++ b/.idea/runConfigurations/Debug_Elasticsearch.xml
@@ -0,0 +1,11 @@
+
+
+
\ No newline at end of file
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index eeea7b412cbf8..e6764b505d5f3 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -103,17 +103,14 @@ be used to test against other JDKs as well, this is not only limited to JDK 11.
 > Note: It is also required to have `JAVA8_HOME`, `JAVA9_HOME`, `JAVA10_HOME`
 and `JAVA11_HOME`, and `JAVA12_HOME` available so that the tests can pass.
 
-> Warning: do not use `sdkman` for Java installations which do not have proper
-`jrunscript` for jdk distributions.
-
 Elasticsearch uses the Gradle wrapper for its build. You can execute Gradle
 using the wrapper via the `gradlew` script on Unix systems or `gradlew.bat`
 script on Windows in the root of the repository. The examples below show the
 usage on Unix.
 
-We support development in the Eclipse and IntelliJ IDEs.
-For Eclipse, the minimum version that we support is [4.13][eclipse].
-For IntelliJ, the minimum version that we support is [IntelliJ 2017.2][intellij].
+We support development in IntelliJ IDEA versions 2019.2 and onwards. We would
+like to support Eclipse, but few of us use it and it has fallen into
+[disrepair][eclipse].
 
 [Docker](https://docs.docker.com/install/) is required for building some
 Elasticsearch artifacts and executing certain test suites. You can run
 Elasticsearch without building all the artifacts with:
@@ -123,48 +120,21 @@ You can access Elasticsearch with:
 
     curl -u elastic:password localhost:9200
 
-### Configuring IDEs And Running Tests
-
-Eclipse users can automatically configure their IDE: `./gradlew eclipse`
-then `File: Import: Gradle : Existing Gradle Project`.
-Additionally you will want to ensure that Eclipse is using 2048m of heap by modifying
-`eclipse.ini` accordingly to avoid GC overhead and OOM errors.
-
-IntelliJ users can automatically configure their IDE: `./gradlew idea`
-then `File->New Project From Existing Sources`. Point to the root of
-the source directory, select
-`Import project from external model->Gradle`, enable
-`Use auto-import`. In order to run tests directly from
-IDEA 2017.2 and above, it is required to disable the IDEA run launcher in order to avoid
-`idea_rt.jar` causing "jar hell". This can be achieved by adding the
-`-Didea.no.launcher=true` [JVM
-option](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties).
-Alternatively, `idea.no.launcher=true` can be set in the
-[`idea.properties`](https://www.jetbrains.com/help/idea/file-idea-properties.html)
-file which can be accessed under Help > Edit Custom Properties (this will require a
-restart of IDEA). For IDEA 2017.3 and above, in addition to the JVM option, you will need to go to
-`Run->Edit Configurations->...->Defaults->JUnit` and verify that the `Shorten command line` setting is set to
-`user-local default: none`. You may also need to [remove `ant-javafx.jar` from your
-classpath](https://github.com/elastic/elasticsearch/issues/14348) if that is
-reported as a source of jar hell.
-
-To run an instance of elasticsearch from the source code run `./gradlew run`
-
-The Elasticsearch codebase makes heavy use of Java `assert`s and the
-test runner requires that assertions be enabled within the JVM. This
-can be accomplished by passing the flag `-ea` to the JVM on startup.
-
-For IntelliJ, go to
-`Run->Edit Configurations...->Defaults->JUnit->VM options` and input
-`-ea`.
-
-For Eclipse, go to `Preferences->Java->Installed JREs` and add `-ea` to
-`VM Arguments`.
-
-Some tests related to locale testing also require the flag
-`-Djava.locale.providers` to be set. Set the VM options/VM arguments for
-IntelliJ or Eclipse like describe above to use
-`-Djava.locale.providers=SPI,COMPAT`
+### Importing the project into IntelliJ IDEA
+
+Elasticsearch builds using Java 13. When importing into IntelliJ you will need
+to define an appropriate SDK. The convention is that **this SDK should be named
+"13"** so that the project import will detect it automatically. For more details
+on defining an SDK in IntelliJ please refer to [their documentation](https://www.jetbrains.com/help/idea/sdk.html#define-sdk).
+SDK definitions are global, so you can add the JDK from any project, or after
+project import. Importing with a missing JDK will still work: IntelliJ will
+simply report a problem and refuse to build until it is resolved.
+
+You can import the Elasticsearch project into IntelliJ IDEA via:
+
+ - Select **File > Open**
+ - In the subsequent dialog navigate to the root `build.gradle` file
+ - In the subsequent dialog select **Open as Project**
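As a rough, hypothetical illustration of why the SDK name matters: a build that wanted to point an IDE at a named SDK via the stock Gradle `idea` plugin could do so as sketched below. This is not code from this change (the `gradle/ide.gradle` script it adds is not shown in this diff, and the change in fact removes the old `idea`-plugin wiring from `build.gradle` further down); it only shows how the name "13" acts as the contract between the build and IntelliJ.

```groovy
// Hedged sketch only -- not the mechanism this change ships.
apply plugin: 'idea'

idea {
    project {
        // Must match the name of the SDK defined in IntelliJ ("13" by convention)
        jdkName = '13'
        targetBytecodeVersion = JavaVersion.VERSION_13
    }
}
```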
 
 ### REST Endpoint Conventions
 
@@ -212,14 +182,7 @@ Please follow these formatting guidelines:
   part of a file. Please format such sections sympathetically with the rest
   of the code, while keeping lines to maximum length of 76 characters.
 * Wildcard imports (`import foo.bar.baz.*`) are forbidden and will cause
-  the build to fail. This can be done automatically by your IDE:
-   * Eclipse: `Preferences->Java->Code Style->Organize Imports`. There are
-     two boxes labeled "`Number of (static )? imports needed for .*`". Set
-     their values to 99999 or some other absurdly high value.
-   * IntelliJ: `Preferences/Settings->Editor->Code Style->Java->Imports`.
-     There are two configuration options: `Class count to use import with
-     '*'` and `Names count to use static import with '*'`. Set their values
-     to 99999 or some other absurdly high value.
+  the build to fail.
 * If *absolutely* necessary, you can disable formatting for regions of code
   with the `// tag::NAME` and `// end::NAME` directives, but note that these
   are intended for use in documentation, so please make it clear what
@@ -234,9 +197,6 @@
 
 #### Editor / IDE Support
 
-Eclipse IDEs can import the file [.eclipseformat.xml]
-directly.
-
 IntelliJ IDEs can
 [import](https://blog.jetbrains.com/idea/2014/01/intellij-idea-13-importing-code-formatter-settings-from-eclipse/)
 the same settings file, and / or use the [Eclipse Code
@@ -393,26 +353,9 @@ It is important that the only code covered by the Elastic licence is contained
 within the top-level `x-pack` directory. The build will fail its pre-commit
 checks if contributed code does not have the appropriate license headers.
 
-You may find it helpful to configure your IDE to automatically insert the
-appropriate license header depending on the part of the project to which you are
-contributing.
-
-#### IntelliJ: Copyright & Scope Profiles
-
-To have IntelliJ insert the correct license, it is necessary to create to copyright profiles.
-These may potentially be called `apache2` and `commercial`. These can be created in
-`Preferences/Settings->Editor->Copyright->Copyright Profiles`. To associate these profiles to
-their respective directories, two "Scopes" will need to be created. These can be created in
-`Preferences/Settings->Appearances & Behavior->Scopes`. When creating scopes, be sure to choose
-the `shared` scope type. Create a scope, `apache2`, with
-the associated pattern of `!file[group:x-pack]:*/`. This pattern will exclude all the files contained in
-the `x-pack` directory. The other scope, `commercial`, will have the inverse pattern of `file[group:x-pack]:*/`.
-The two scopes, together, should account for all the files in the project. To associate the scopes
-with their copyright-profiles, go into `Preferences/Settings->Editor>Copyright` and use the `+` to add
-the associations `apache2/apache2` and `commercial/commercial`.
-
-Configuring these options in IntelliJ can be quite buggy, so do not be alarmed if you have to open/close
-the settings window and/or restart IntelliJ to see your changes take effect.
+> **NOTE:** If you have imported the project into IntelliJ IDEA the project will
+> be automatically configured to add the correct license header to new source
+> files based on the source location.
 
 ### Creating A Distribution
 
@@ -425,7 +368,7 @@ cd elasticsearch/
 
 To build a darwin-tar distribution, run this command:
 
 ```sh
-./gradlew -p distribution/archives/darwin-tar assemble --parallel
+./gradlew -p distribution/archives/darwin-tar assemble
 ```
 
 You will find the distribution under:
@@ -435,9 +378,12 @@ To create all build artifacts (e.g., plugins and Javadocs) as well as
 distributions in all formats, run this command:
 
 ```sh
-./gradlew assemble --parallel
+./gradlew assemble
 ```
 
+> **NOTE:** Running the task above will fail if you don't have an available
+> Docker installation.
+
 The package distributions (Debian and RPM) can be found under:
 `./distribution/packages/(deb|rpm|oss-deb|oss-rpm)/build/distributions/`
 
@@ -570,10 +516,6 @@ known as "transitive" dependencies".
 should not be shipped with the project because it is "provided" by the
 runtime somehow. Elasticsearch plugins use this configuration to include
 dependencies that are bundled with Elasticsearch's server.
-<dt>`bundle`</dt><dd>Only available in projects with the shadow plugin,
-dependencies with this configuration are bundled into the jar produced by the
-build. Since IDEs do not understand this configuration we rig them to treat
-dependencies in this configuration as `compile` dependencies.</dd>
 <dt>`testCompile`</dt><dd>Code that is on the classpath for compiling tests
 that are part of this project but not production code. The canonical example
 of this is `junit`.</dd>
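To make the configuration glossary above concrete, here is a purely illustrative `build.gradle` fragment. The coordinates and versions are examples, not taken from this change, and `compileOnly` is assumed here as the Gradle configuration carrying the "provided" semantics described above.

```groovy
dependencies {
    // "provided" style: compiled against, but not shipped, because the
    // Elasticsearch server supplies it at runtime
    compileOnly "org.elasticsearch:elasticsearch:7.8.0"

    // test-only classpath -- the canonical `testCompile` example
    testCompile "junit:junit:4.12"
}
```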
@@ -608,6 +550,5 @@ Finally, we require that you run `./gradlew check` before submitting a
 non-documentation contribution. This is mentioned above, but it is worth
 repeating in this section because it has come up in this context.
 
-[eclipse]: https://download.eclipse.org/eclipse/downloads/drops4/R-4.13-201909161045/
 [intellij]: https://blog.jetbrains.com/idea/2017/07/intellij-idea-2017-2-is-here-smart-sleek-and-snappy/
-[shadow-plugin]: https://github.com/johnrengelman/shadow
+[eclipse]: https://github.com/elastic/elasticsearch/issues/53664
diff --git a/TESTING.asciidoc b/TESTING.asciidoc
index 6c3ca01c7ae72..fe04cc49637ba 100644
--- a/TESTING.asciidoc
+++ b/TESTING.asciidoc
@@ -20,9 +20,9 @@ To create a platform-specific build including the x-pack modules, use the
 following depending on your operating system:
 
 -----------------------------
-./gradlew :distribution:archives:linux-tar:assemble --parallel
-./gradlew :distribution:archives:darwin-tar:assemble --parallel
-./gradlew :distribution:archives:windows-zip:assemble --parallel
+./gradlew :distribution:archives:linux-tar:assemble
+./gradlew :distribution:archives:darwin-tar:assemble
+./gradlew :distribution:archives:windows-zip:assemble
 -----------------------------
 
 === Running Elasticsearch from a checkout
 
@@ -51,6 +51,10 @@ recommended to configure the IDE to initiate multiple listening attempts. In
 case of IntelliJ, the relevant option is called "Auto restart" and needs to be
 checked. In case of Eclipse, "Connection limit" setting needs to be configured
 with a greater value (ie 10 or more).
 
+NOTE: If you have imported the project into IntelliJ according to the instructions in
+link:/CONTRIBUTING.md#importing-the-project-into-intellij-idea[CONTRIBUTING.md] then a debug run configuration
+named "Debug Elasticsearch" will be created for you and configured appropriately.
+
 ==== Distribution
 
 By default a node is started with the zip distribution.
diff --git a/build.gradle b/build.gradle
index 2dbf94ae7dabb..a23614bafa975 100644
--- a/build.gradle
+++ b/build.gradle
@@ -17,20 +17,21 @@
  * under the License.
*/ - import com.avast.gradle.dockercompose.tasks.ComposePull import com.github.jengelman.gradle.plugins.shadow.ShadowPlugin +import de.thetaphi.forbiddenapis.gradle.ForbiddenApisPlugin import org.apache.tools.ant.taskdefs.condition.Os import org.elasticsearch.gradle.BuildPlugin import org.elasticsearch.gradle.BwcVersions import org.elasticsearch.gradle.Version import org.elasticsearch.gradle.VersionProperties import org.elasticsearch.gradle.plugin.PluginBuildPlugin +import org.gradle.plugins.ide.eclipse.model.AccessRule import org.gradle.plugins.ide.eclipse.model.SourceFolder import org.gradle.util.DistributionLocator import org.gradle.util.GradleVersion -import static org.elasticsearch.gradle.tool.Boilerplate.maybeConfigure +import static org.elasticsearch.gradle.util.GradleUtils.maybeConfigure plugins { id 'lifecycle-base' @@ -43,6 +44,7 @@ apply plugin: 'nebula.info-scm' apply from: 'gradle/build-scan.gradle' apply from: 'gradle/build-complete.gradle' apply from: 'gradle/runtime-jdk-provision.gradle' +apply from: 'gradle/ide.gradle' // common maven publishing configuration allprojects { @@ -180,7 +182,6 @@ allprojects { System.getProperty("eclipse.application") != null || // Detects gradle launched from the Eclipse compiler server gradle.startParameter.taskNames.contains('eclipse') || // Detects gradle launched from the command line to do eclipse stuff gradle.startParameter.taskNames.contains('cleanEclipse') - isIdea = System.getProperty("idea.active") != null || gradle.startParameter.taskNames.contains('idea') || gradle.startParameter.taskNames.contains('cleanIdea') // for BWC testing bwcVersions = versions @@ -356,36 +357,6 @@ gradle.projectsEvaluated { } } -// intellij configuration -allprojects { - apply plugin: 'idea' - - if (isIdea) { - project.buildDir = file('build-idea') - } - idea { - module { - inheritOutputDirs = false - outputDir = file('build-idea/classes/main') - testOutputDir = file('build-idea/classes/test') - - // also ignore other possible build dirs - excludeDirs += file('build') - excludeDirs += file('build-eclipse') - } - } - - tasks.named('cleanIdea') { - delete 'build-idea' - } -} - -idea { - project { - vcs = 'Git' - } -} - // eclipse configuration allprojects { apply plugin: 'eclipse' @@ -408,6 +379,21 @@ allprojects { } } } + /* + * Allow accessing com/sun/net/httpserver in projects that have + * configured forbidden apis to allow it. 
+ */ + plugins.withType(ForbiddenApisPlugin) { + eclipse.classpath.file.whenMerged { classpath -> + if (false == forbiddenApisTest.bundledSignatures.contains('jdk-non-portable')) { + classpath.entries + .findAll { it.kind == "con" && it.toString().contains("org.eclipse.jdt.launching.JRE_CONTAINER") } + .each { + it.accessRules.add(new AccessRule("accessible", "com/sun/net/httpserver/*")) + } + } + } + } File licenseHeaderFile String prefix = ':x-pack' @@ -559,3 +545,15 @@ allprojects { } } } + +// TODO: remove this once 7.7 is released and the 7.x branch is 7.8 +subprojects { + pluginManager.withPlugin('elasticsearch.testclusters') { + testClusters.all { + if (org.elasticsearch.gradle.info.BuildParams.isSnapshotBuild() == false) { + systemProperty 'es.itv2_feature_flag_registered', 'true' + systemProperty 'es.datastreams_feature_flag_registered', 'true' + } + } + } +} diff --git a/buildSrc/build.gradle b/buildSrc/build.gradle index 7ffe1490eed71..9b82bf3c5ce6a 100644 --- a/buildSrc/build.gradle +++ b/buildSrc/build.gradle @@ -187,8 +187,10 @@ if (project != rootProject) { distribution project(':distribution:archives:oss-windows-zip') distribution project(':distribution:archives:darwin-tar') distribution project(':distribution:archives:oss-darwin-tar') + distribution project(':distribution:archives:linux-aarch64-tar') distribution project(':distribution:archives:linux-tar') distribution project(':distribution:archives:oss-linux-tar') + distribution project(':distribution:archives:oss-linux-aarch64-tar') } // for external projects we want to remove the marker file indicating we are running the Elasticsearch project diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy index 8b8c088c86244..7580671cc4cd6 100644 --- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy +++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy @@ -32,7 +32,7 @@ import org.elasticsearch.gradle.precommit.PrecommitTasks import org.elasticsearch.gradle.test.ErrorReportingTestListener import org.elasticsearch.gradle.testclusters.ElasticsearchCluster import org.elasticsearch.gradle.testclusters.TestClustersPlugin -import org.elasticsearch.gradle.tool.Boilerplate +import org.elasticsearch.gradle.util.GradleUtils import org.gradle.api.Action import org.gradle.api.GradleException import org.gradle.api.InvalidUserDataException @@ -46,6 +46,7 @@ import org.gradle.api.artifacts.Dependency import org.gradle.api.artifacts.ModuleDependency import org.gradle.api.artifacts.ProjectDependency import org.gradle.api.artifacts.dsl.RepositoryHandler +import org.gradle.api.artifacts.repositories.ExclusiveContentRepository import org.gradle.api.artifacts.repositories.IvyArtifactRepository import org.gradle.api.artifacts.repositories.IvyPatternRepositoryLayout import org.gradle.api.artifacts.repositories.MavenArtifactRepository @@ -81,7 +82,7 @@ import org.gradle.util.GradleVersion import java.nio.charset.StandardCharsets import java.nio.file.Files -import static org.elasticsearch.gradle.tool.Boilerplate.maybeConfigure +import static org.elasticsearch.gradle.util.GradleUtils.maybeConfigure /** * Encapsulates build configuration for elasticsearch projects. 
@@ -147,7 +148,7 @@ class BuildPlugin implements Plugin { File securityPolicy = buildResources.copy("fips_java.policy") File bcfksKeystore = buildResources.copy("cacerts.bcfks") // This configuration can be removed once system modules are available - Boilerplate.maybeCreate(project.configurations, 'extraJars') { + GradleUtils.maybeCreate(project.configurations, 'extraJars') { project.dependencies.add('extraJars', "org.bouncycastle:bc-fips:1.0.1") project.dependencies.add('extraJars', "org.bouncycastle:bctls-fips:1.0.9") } @@ -234,7 +235,7 @@ class BuildPlugin implements Plugin { static String getJavaHome(final Task task, final int version) { requireJavaHome(task, version) JavaHome java = BuildParams.javaVersions.find { it.version == version } - return java == null ? null : java.javaHome.absolutePath + return java == null ? null : java.javaHome.get().absolutePath } /** @@ -327,10 +328,16 @@ class BuildPlugin implements Plugin { // extract the revision number from the version with a regex matcher List matches = (luceneVersion =~ /\w+-snapshot-([a-z0-9]+)/).getAt(0) as List String revision = matches.get(1) - repos.maven { MavenArtifactRepository repo -> + MavenArtifactRepository luceneRepo = repos.maven { MavenArtifactRepository repo -> repo.name = 'lucene-snapshots' repo.url = "https://s3.amazonaws.com/download.elasticsearch.org/lucenesnapshots/${revision}" } + repos.exclusiveContent { ExclusiveContentRepository exclusiveRepo -> + exclusiveRepo.filter { + it.includeVersionByRegex(/org\.apache\.lucene/, '.*', ".*-snapshot-${revision}") + } + exclusiveRepo.forRepositories(luceneRepo) + } } } @@ -446,7 +453,9 @@ class BuildPlugin implements Plugin { project.pluginManager.withPlugin('com.github.johnrengelman.shadow') { // Ensure that when we are compiling against the "original" JAR that we also include any "shadow" dependencies on the compile classpath - project.configurations.getByName(JavaPlugin.API_ELEMENTS_CONFIGURATION_NAME).extendsFrom(project.configurations.getByName(ShadowBasePlugin.CONFIGURATION_NAME)) + project.configurations.getByName(ShadowBasePlugin.CONFIGURATION_NAME).dependencies.all { Dependency dependency -> + project.configurations.getByName(JavaPlugin.API_ELEMENTS_CONFIGURATION_NAME).dependencies.add(dependency) + } } } diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/doc/RestTestsFromSnippetsTask.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/doc/RestTestsFromSnippetsTask.groovy index 5b855a4d9c7b5..e271ff1a58750 100644 --- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/doc/RestTestsFromSnippetsTask.groovy +++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/doc/RestTestsFromSnippetsTask.groovy @@ -314,10 +314,7 @@ class RestTestsFromSnippetsTask extends SnippetsTask { if (path == null) { path = '' // Catch requests to the root... 
} else { - // Escape some characters that are also escaped by sense path = path.replace('<', '%3C').replace('>', '%3E') - path = path.replace('{', '%7B').replace('}', '%7D') - path = path.replace('|', '%7C') } current.println(" - do:") if (catchPart != null) { diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/precommit/PrecommitTasks.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/precommit/PrecommitTasks.groovy index 52f98330ee50d..d95514f975a13 100644 --- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/precommit/PrecommitTasks.groovy +++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/precommit/PrecommitTasks.groovy @@ -143,16 +143,13 @@ class PrecommitTasks { ExportElasticsearchBuildResourcesTask buildResources = project.tasks.getByName('buildResources') project.tasks.withType(CheckForbiddenApis).configureEach { dependsOn(buildResources) - doFirst { - // we need to defer this configuration since we don't know the runtime java version until execution time - targetCompatibility = BuildParams.runtimeJavaVersion.majorVersion - if (BuildParams.runtimeJavaVersion > JavaVersion.VERSION_13) { - project.logger.warn( - "Forbidden APIs does not support Java versions past 13. Will use the signatures from 13 for {}.", - BuildParams.runtimeJavaVersion - ) - targetCompatibility = JavaVersion.VERSION_13.majorVersion - } + targetCompatibility = BuildParams.runtimeJavaVersion.majorVersion + if (BuildParams.runtimeJavaVersion > JavaVersion.VERSION_13) { + project.logger.warn( + "Forbidden APIs does not support Java versions past 13. Will use the signatures from 13 for {}.", + BuildParams.runtimeJavaVersion + ) + targetCompatibility = JavaVersion.VERSION_13.majorVersion } bundledSignatures = [ "jdk-unsafe", "jdk-deprecated", "jdk-non-portable", "jdk-system-out" diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/Architecture.java b/buildSrc/src/main/java/org/elasticsearch/gradle/Architecture.java new file mode 100644 index 0000000000000..f230d9af86e11 --- /dev/null +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/Architecture.java @@ -0,0 +1,40 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.elasticsearch.gradle; + +public enum Architecture { + + X64, + AARCH64; + + public static Architecture current() { + final String architecture = System.getProperty("os.arch", ""); + switch (architecture) { + case "amd64": + case "x86_64": + return X64; + case "aarch64": + return AARCH64; + default: + throw new IllegalArgumentException("can not determine architecture from [" + architecture + "]"); + } + } + +} diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/DistributionDownloadPlugin.java b/buildSrc/src/main/java/org/elasticsearch/gradle/DistributionDownloadPlugin.java index 0982fe0267cc5..048403d83f14d 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/DistributionDownloadPlugin.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/DistributionDownloadPlugin.java @@ -26,7 +26,7 @@ import org.elasticsearch.gradle.docker.DockerSupportService; import org.elasticsearch.gradle.info.BuildParams; import org.elasticsearch.gradle.info.GlobalBuildInfoPlugin; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.GradleException; import org.gradle.api.NamedDomainObjectContainer; import org.gradle.api.Plugin; @@ -52,7 +52,7 @@ import java.util.concurrent.Callable; import java.util.function.Supplier; -import static org.elasticsearch.gradle.Util.capitalize; +import static org.elasticsearch.gradle.util.Util.capitalize; /** * A plugin to manage getting and extracting distributions of Elasticsearch. @@ -77,7 +77,7 @@ public void apply(Project project) { project.getRootProject().getPluginManager().apply(GlobalBuildInfoPlugin.class); project.getRootProject().getPluginManager().apply(DockerSupportPlugin.class); - Provider dockerSupport = Boilerplate.getBuildService( + Provider dockerSupport = GradleUtils.getBuildService( project.getGradle().getSharedServices(), DockerSupportPlugin.DOCKER_SUPPORT_SERVICE_NAME ); diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/Jdk.java b/buildSrc/src/main/java/org/elasticsearch/gradle/Jdk.java index 42df062ba3bb0..e4c06560ab2b6 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/Jdk.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/Jdk.java @@ -33,6 +33,7 @@ public class Jdk implements Buildable, Iterable { + private static final List ALLOWED_ARCHITECTURES = List.of("aarch64", "x64"); private static final List ALLOWED_VENDORS = List.of("adoptopenjdk", "openjdk"); private static final List ALLOWED_PLATFORMS = List.of("darwin", "linux", "windows", "mac"); private static final Pattern VERSION_PATTERN = Pattern.compile("(\\d+)(\\.\\d+\\.\\d+)?\\+(\\d+(?:\\.\\d+)?)(@([a-f0-9]{32}))?"); @@ -44,6 +45,7 @@ public class Jdk implements Buildable, Iterable { private final Property vendor; private final Property version; private final Property platform; + private final Property architecture; private String baseVersion; private String major; private String build; @@ -55,6 +57,7 @@ public class Jdk implements Buildable, Iterable { this.vendor = objectFactory.property(String.class); this.version = objectFactory.property(String.class); this.platform = objectFactory.property(String.class); + this.architecture = objectFactory.property(String.class); } public String getName() { @@ -97,6 +100,19 @@ public void setPlatform(String platform) { this.platform.set(platform); } + public String getArchitecture() { + return architecture.get(); + } + + public void setArchitecture(final String architecture) { + if (ALLOWED_ARCHITECTURES.contains(architecture) == 
false) { + throw new IllegalArgumentException( + "unknown architecture [" + architecture + "] for jdk [" + name + "], must be one of " + ALLOWED_ARCHITECTURES + ); + } + this.architecture.set(architecture); + } + public String getBaseVersion() { return baseVersion; } @@ -153,9 +169,13 @@ void finalizeValues() { if (vendor.isPresent() == false) { throw new IllegalArgumentException("vendor not specified for jdk [" + name + "]"); } + if (architecture.isPresent() == false) { + throw new IllegalArgumentException("architecture not specified for jdk [" + name + "]"); + } version.finalizeValue(); platform.finalizeValue(); vendor.finalizeValue(); + architecture.finalizeValue(); } @Override diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/JdkDownloadPlugin.java b/buildSrc/src/main/java/org/elasticsearch/gradle/JdkDownloadPlugin.java index b63d8f65ea04f..a2dca7c247c0d 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/JdkDownloadPlugin.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/JdkDownloadPlugin.java @@ -47,9 +47,8 @@ import java.util.concurrent.Callable; import java.util.stream.StreamSupport; -import static org.elasticsearch.gradle.Util.capitalize; -import static org.elasticsearch.gradle.tool.Boilerplate.findByName; -import static org.elasticsearch.gradle.tool.Boilerplate.maybeCreate; +import static org.elasticsearch.gradle.util.GradleUtils.findByName; +import static org.elasticsearch.gradle.util.GradleUtils.maybeCreate; public class JdkDownloadPlugin implements Plugin { @@ -72,7 +71,10 @@ public void apply(Project project) { DependencyHandler dependencies = project.getDependencies(); Map depConfig = new HashMap<>(); depConfig.put("path", ":"); // root project - depConfig.put("configuration", configName("extracted_jdk", jdk.getVendor(), jdk.getVersion(), jdk.getPlatform())); + depConfig.put( + "configuration", + configName("extracted_jdk", jdk.getVendor(), jdk.getVersion(), jdk.getPlatform(), jdk.getArchitecture()) + ); project.getDependencies().add(jdk.getConfigurationName(), dependencies.project(depConfig)); // ensure a root level jdk download task exists @@ -87,7 +89,14 @@ public static NamedDomainObjectContainer getContainer(Project project) { } private static void setupRootJdkDownload(Project rootProject, Jdk jdk) { - String extractTaskName = "extract" + capitalize(jdk.getPlatform()) + "Jdk-" + jdk.getVendor() + "-" + jdk.getVersion(); + String extractTaskName = String.format( + Locale.ROOT, + "extract-%s-%s-jdk-%s-%s", + jdk.getPlatform(), + jdk.getArchitecture(), + jdk.getVendor(), + jdk.getVersion() + ); // Skip setup if we've already configured a JDK for this platform, vendor and version if (findByName(rootProject.getTasks(), extractTaskName) == null) { @@ -107,7 +116,7 @@ private static void setupRootJdkDownload(Project rootProject, Jdk jdk) { repoUrl = "https://artifactory.elstc.co/artifactory/oss-jdk-local/"; artifactPattern = String.format( Locale.ROOT, - "adoptopenjdk/OpenJDK%sU-jdk_x64_[module]_hotspot_[revision]_%s.[ext]", + "adoptopenjdk/OpenJDK%sU-jdk_[classifier]_[module]_hotspot_[revision]_%s.[ext]", jdk.getMajor(), jdk.getBuild() ); @@ -121,14 +130,14 @@ private static void setupRootJdkDownload(Project rootProject, Jdk jdk) { + jdk.getHash() + "/" + jdk.getBuild() - + "/GPL/openjdk-[revision]_[module]-x64_bin.[ext]"; + + "/GPL/openjdk-[revision]_[module]-[classifier]_bin.[ext]"; } else { // simpler legacy pattern from JDK 9 to JDK 12 that we are advocating to Oracle to bring back artifactPattern = "java/GA/jdk" + jdk.getMajor() + "/" + 
jdk.getBuild() - + "/GPL/openjdk-[revision]_[module]-x64_bin.[ext]"; + + "/GPL/openjdk-[revision]_[module]-[classifier]_bin.[ext]"; } } else { throw new GradleException("Unknown JDK vendor [" + jdk.getVendor() + "]"); @@ -150,14 +159,14 @@ private static void setupRootJdkDownload(Project rootProject, Jdk jdk) { // Declare a configuration and dependency from which to download the remote JDK final ConfigurationContainer configurations = rootProject.getConfigurations(); - String downloadConfigName = configName(jdk.getVendor(), jdk.getVersion(), jdk.getPlatform()); + String downloadConfigName = configName(jdk.getVendor(), jdk.getVersion(), jdk.getPlatform(), jdk.getArchitecture()); Configuration downloadConfiguration = maybeCreate(configurations, downloadConfigName); rootProject.getDependencies().add(downloadConfigName, dependencyNotation(jdk)); // Create JDK extract task final Provider extractPath = rootProject.getLayout() .getBuildDirectory() - .dir("jdks/" + jdk.getVendor() + "-" + jdk.getBaseVersion() + "_" + jdk.getPlatform()); + .dir("jdks/" + jdk.getVendor() + "-" + jdk.getBaseVersion() + "_" + jdk.getPlatform() + "_" + jdk.getArchitecture()); TaskProvider extractTask = createExtractTask( extractTaskName, @@ -168,7 +177,13 @@ private static void setupRootJdkDownload(Project rootProject, Jdk jdk) { ); // Declare a configuration for the extracted JDK archive - String artifactConfigName = configName("extracted_jdk", jdk.getVendor(), jdk.getVersion(), jdk.getPlatform()); + String artifactConfigName = configName( + "extracted_jdk", + jdk.getVendor(), + jdk.getVersion(), + jdk.getPlatform(), + jdk.getArchitecture() + ); maybeCreate(configurations, artifactConfigName); rootProject.getArtifacts().add(artifactConfigName, extractPath, artifact -> artifact.builtBy(extractTask)); } @@ -254,7 +269,7 @@ private static String dependencyNotation(Jdk jdk) { : jdk.getPlatform(); String extension = jdk.getPlatform().equals("windows") ? 
"zip" : "tar.gz"; - return groupName(jdk) + ":" + platformDep + ":" + jdk.getBaseVersion() + "@" + extension; + return groupName(jdk) + ":" + platformDep + ":" + jdk.getBaseVersion() + ":" + jdk.getArchitecture() + "@" + extension; } private static String groupName(Jdk jdk) { diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/info/GenerateGlobalBuildInfoTask.java b/buildSrc/src/main/java/org/elasticsearch/gradle/info/GenerateGlobalBuildInfoTask.java deleted file mode 100644 index 75ae735e8a9f1..0000000000000 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/info/GenerateGlobalBuildInfoTask.java +++ /dev/null @@ -1,293 +0,0 @@ -package org.elasticsearch.gradle.info; - -import org.elasticsearch.gradle.OS; -import org.gradle.api.DefaultTask; -import org.gradle.api.GradleException; -import org.gradle.api.JavaVersion; -import org.gradle.api.file.RegularFileProperty; -import org.gradle.api.model.ObjectFactory; -import org.gradle.api.tasks.CacheableTask; -import org.gradle.api.tasks.Input; -import org.gradle.api.tasks.InputDirectory; -import org.gradle.api.tasks.Nested; -import org.gradle.api.tasks.OutputFile; -import org.gradle.api.tasks.PathSensitive; -import org.gradle.api.tasks.PathSensitivity; -import org.gradle.api.tasks.TaskAction; -import org.gradle.internal.jvm.Jvm; -import org.gradle.process.ExecResult; - -import javax.inject.Inject; -import java.io.BufferedWriter; -import java.io.ByteArrayOutputStream; -import java.io.File; -import java.io.FileWriter; -import java.io.IOException; -import java.io.UncheckedIOException; -import java.io.Writer; -import java.nio.file.Files; -import java.util.Arrays; -import java.util.List; -import java.util.Locale; - -import static java.nio.charset.StandardCharsets.UTF_8; - -@CacheableTask -public class GenerateGlobalBuildInfoTask extends DefaultTask { - private JavaVersion minimumCompilerVersion; - private JavaVersion minimumRuntimeVersion; - private File compilerJavaHome; - private File runtimeJavaHome; - private List javaVersions; - private final RegularFileProperty outputFile; - private final RegularFileProperty compilerVersionFile; - private final RegularFileProperty runtimeVersionFile; - - @Inject - public GenerateGlobalBuildInfoTask(ObjectFactory objectFactory) { - this.outputFile = objectFactory.fileProperty(); - this.compilerVersionFile = objectFactory.fileProperty(); - this.runtimeVersionFile = objectFactory.fileProperty(); - } - - @Input - public JavaVersion getMinimumCompilerVersion() { - return minimumCompilerVersion; - } - - public void setMinimumCompilerVersion(JavaVersion minimumCompilerVersion) { - this.minimumCompilerVersion = minimumCompilerVersion; - } - - @Input - public JavaVersion getMinimumRuntimeVersion() { - return minimumRuntimeVersion; - } - - public void setMinimumRuntimeVersion(JavaVersion minimumRuntimeVersion) { - this.minimumRuntimeVersion = minimumRuntimeVersion; - } - - @InputDirectory - @PathSensitive(PathSensitivity.RELATIVE) - public File getCompilerJavaHome() { - return compilerJavaHome; - } - - public void setCompilerJavaHome(File compilerJavaHome) { - this.compilerJavaHome = compilerJavaHome; - } - - @InputDirectory - @PathSensitive(PathSensitivity.RELATIVE) - public File getRuntimeJavaHome() { - return runtimeJavaHome; - } - - public void setRuntimeJavaHome(File runtimeJavaHome) { - this.runtimeJavaHome = runtimeJavaHome; - } - - @Nested - public List getJavaVersions() { - return javaVersions; - } - - public void setJavaVersions(List javaVersions) { - this.javaVersions = javaVersions; - } - - 
@OutputFile - public RegularFileProperty getOutputFile() { - return outputFile; - } - - @OutputFile - public RegularFileProperty getCompilerVersionFile() { - return compilerVersionFile; - } - - @OutputFile - public RegularFileProperty getRuntimeVersionFile() { - return runtimeVersionFile; - } - - @TaskAction - public void generate() { - String javaVendorVersion = System.getProperty("java.vendor.version", System.getProperty("java.vendor")); - String gradleJavaVersion = System.getProperty("java.version"); - String gradleJavaVersionDetails = javaVendorVersion - + " " - + gradleJavaVersion - + " [" - + System.getProperty("java.vm.name") - + " " - + System.getProperty("java.vm.version") - + "]"; - - String compilerJavaVersionDetails = gradleJavaVersionDetails; - JavaVersion compilerJavaVersionEnum = JavaVersion.current(); - String runtimeJavaVersionDetails = gradleJavaVersionDetails; - JavaVersion runtimeJavaVersionEnum = JavaVersion.current(); - File gradleJavaHome = Jvm.current().getJavaHome(); - - try { - if (Files.isSameFile(compilerJavaHome.toPath(), gradleJavaHome.toPath()) == false) { - if (compilerJavaHome.exists()) { - compilerJavaVersionDetails = findJavaVersionDetails(compilerJavaHome); - compilerJavaVersionEnum = JavaVersion.toVersion(findJavaSpecificationVersion(compilerJavaHome)); - } else { - throw new RuntimeException("Compiler Java home path of '" + compilerJavaHome + "' does not exist"); - } - } - - if (Files.isSameFile(runtimeJavaHome.toPath(), gradleJavaHome.toPath()) == false) { - if (runtimeJavaHome.exists()) { - runtimeJavaVersionDetails = findJavaVersionDetails(runtimeJavaHome); - runtimeJavaVersionEnum = JavaVersion.toVersion(findJavaSpecificationVersion(runtimeJavaHome)); - } else { - throw new RuntimeException("Runtime Java home path of '" + compilerJavaHome + "' does not exist"); - } - } - } catch (IOException e) { - throw new UncheckedIOException(e); - } - - try (BufferedWriter writer = new BufferedWriter(new FileWriter(outputFile.getAsFile().get()))) { - final String osName = System.getProperty("os.name"); - final String osVersion = System.getProperty("os.version"); - final String osArch = System.getProperty("os.arch"); - final JavaVersion parsedVersion = JavaVersion.toVersion(gradleJavaVersion); - - writer.write(" Gradle Version : " + getProject().getGradle().getGradleVersion() + "\n"); - writer.write(" OS Info : " + osName + " " + osVersion + " (" + osArch + ")\n"); - - if (gradleJavaVersionDetails.equals(compilerJavaVersionDetails) == false - || gradleJavaVersionDetails.equals(runtimeJavaVersionDetails) == false) { - writer.write(" Compiler JDK Version : " + compilerJavaVersionEnum + " (" + compilerJavaVersionDetails + ")\n"); - writer.write(" Compiler java.home : " + compilerJavaHome + "\n"); - writer.write(" Runtime JDK Version : " + runtimeJavaVersionEnum + " (" + runtimeJavaVersionDetails + ")\n"); - writer.write(" Runtime java.home : " + runtimeJavaHome + "\n"); - writer.write(" Gradle JDK Version : " + parsedVersion + " (" + gradleJavaVersionDetails + ")\n"); - writer.write(" Gradle java.home : " + gradleJavaHome); - } else { - writer.write(" JDK Version : " + parsedVersion + " (" + gradleJavaVersionDetails + ")\n"); - writer.write(" JAVA_HOME : " + gradleJavaHome); - } - } catch (IOException e) { - throw new UncheckedIOException(e); - } - - // enforce Java version - if (compilerJavaVersionEnum.compareTo(minimumCompilerVersion) < 0) { - String message = String.format( - Locale.ROOT, - "The compiler java.home must be set to a JDK installation directory for 
Java %s but is [%s] " + "corresponding to [%s]", - minimumCompilerVersion, - compilerJavaHome, - compilerJavaVersionEnum - ); - throw new GradleException(message); - } - - if (runtimeJavaVersionEnum.compareTo(minimumRuntimeVersion) < 0) { - String message = String.format( - Locale.ROOT, - "The runtime java.home must be set to a JDK installation directory for Java %s but is [%s] " + "corresponding to [%s]", - minimumRuntimeVersion, - runtimeJavaHome, - runtimeJavaVersionEnum - ); - throw new GradleException(message); - } - - for (JavaHome javaVersion : javaVersions) { - File javaHome = javaVersion.getJavaHome(); - if (javaHome == null) { - continue; - } - JavaVersion javaVersionEnum = JavaVersion.toVersion(findJavaSpecificationVersion(javaHome)); - JavaVersion expectedJavaVersionEnum; - int version = javaVersion.getVersion(); - if (version < 9) { - expectedJavaVersionEnum = JavaVersion.toVersion("1." + version); - } else { - expectedJavaVersionEnum = JavaVersion.toVersion(Integer.toString(version)); - } - if (javaVersionEnum != expectedJavaVersionEnum) { - String message = String.format( - Locale.ROOT, - "The environment variable JAVA%d_HOME must be set to a JDK installation directory for Java" - + " %s but is [%s] corresponding to [%s]", - version, - expectedJavaVersionEnum, - javaHome, - javaVersionEnum - ); - throw new GradleException(message); - } - } - - writeToFile(compilerVersionFile.getAsFile().get(), compilerJavaVersionEnum.name()); - writeToFile(runtimeVersionFile.getAsFile().get(), runtimeJavaVersionEnum.name()); - } - - private void writeToFile(File file, String content) { - try (Writer writer = new FileWriter(file)) { - writer.write(content); - } catch (IOException e) { - throw new UncheckedIOException(e); - } - } - - /** - * Finds printable java version of the given JAVA_HOME - */ - private String findJavaVersionDetails(File javaHome) { - String versionInfoScript = "print(" - + "java.lang.System.getProperty(\"java.vendor.version\", java.lang.System.getProperty(\"java.vendor\")) + \" \" + " - + "java.lang.System.getProperty(\"java.version\") + \" [\" + " - + "java.lang.System.getProperty(\"java.vm.name\") + \" \" + " - + "java.lang.System.getProperty(\"java.vm.version\") + \"]\");"; - return runJavaAsScript(javaHome, versionInfoScript).trim(); - } - - /** - * Finds the parsable java specification version - */ - private String findJavaSpecificationVersion(File javaHome) { - String versionScript = "print(java.lang.System.getProperty(\"java.specification.version\"));"; - return runJavaAsScript(javaHome, versionScript); - } - - /** - * Runs the given javascript using jjs from the jdk, and returns the output - */ - private String runJavaAsScript(File javaHome, String script) { - ByteArrayOutputStream stdout = new ByteArrayOutputStream(); - ByteArrayOutputStream stderr = new ByteArrayOutputStream(); - if (OS.current() == OS.WINDOWS) { - // gradle/groovy does not properly escape the double quote for windows - script = script.replace("\"", "\\\""); - } - File jrunscriptPath = new File(javaHome, "bin/jrunscript"); - String finalScript = script; - ExecResult result = getProject().exec(spec -> { - spec.setExecutable(jrunscriptPath); - spec.args("-e", finalScript); - spec.setStandardOutput(stdout); - spec.setErrorOutput(stderr); - spec.setIgnoreExitValue(true); - }); - - if (result.getExitValue() != 0) { - getLogger().error("STDOUT:"); - Arrays.stream(stdout.toString(UTF_8).split(System.getProperty("line.separator"))).forEach(getLogger()::error); - getLogger().error("STDERR:"); - 
Arrays.stream(stderr.toString(UTF_8).split(System.getProperty("line.separator"))).forEach(getLogger()::error); - result.rethrowFailure(); - } - return stdout.toString(UTF_8).trim(); - } - -} diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/info/GlobalBuildInfoPlugin.java b/buildSrc/src/main/java/org/elasticsearch/gradle/info/GlobalBuildInfoPlugin.java index 431e66e9cde94..21642b30c0480 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/info/GlobalBuildInfoPlugin.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/info/GlobalBuildInfoPlugin.java @@ -1,18 +1,27 @@ package org.elasticsearch.gradle.info; import org.elasticsearch.gradle.OS; +import org.elasticsearch.gradle.util.Util; import org.gradle.api.GradleException; import org.gradle.api.JavaVersion; import org.gradle.api.Plugin; import org.gradle.api.Project; +import org.gradle.api.logging.Logger; +import org.gradle.api.logging.Logging; +import org.gradle.api.model.ObjectFactory; +import org.gradle.api.provider.Provider; +import org.gradle.api.provider.ProviderFactory; import org.gradle.internal.jvm.Jvm; +import org.gradle.jvm.toolchain.JavaInstallation; +import org.gradle.jvm.toolchain.JavaInstallationRegistry; +import org.gradle.util.GradleVersion; +import javax.inject.Inject; import java.io.BufferedReader; import java.io.ByteArrayOutputStream; import java.io.File; import java.io.FileReader; import java.io.IOException; -import java.io.InputStreamReader; import java.io.UncheckedIOException; import java.nio.charset.StandardCharsets; import java.nio.file.Files; @@ -33,102 +42,168 @@ import java.util.stream.Stream; public class GlobalBuildInfoPlugin implements Plugin { - private static final String GLOBAL_INFO_EXTENSION_NAME = "globalInfo"; + private static final Logger LOGGER = Logging.getLogger(GlobalBuildInfoPlugin.class); private static Integer _defaultParallel = null; + private final JavaInstallationRegistry javaInstallationRegistry; + private final ObjectFactory objects; + private final ProviderFactory providers; + + @Inject + public GlobalBuildInfoPlugin(JavaInstallationRegistry javaInstallationRegistry, ObjectFactory objects, ProviderFactory providers) { + this.javaInstallationRegistry = javaInstallationRegistry; + this.objects = objects; + this.providers = providers; + } + @Override public void apply(Project project) { if (project != project.getRootProject()) { throw new IllegalStateException(this.getClass().getName() + " can only be applied to the root project."); } - GlobalInfoExtension extension = project.getExtensions().create(GLOBAL_INFO_EXTENSION_NAME, GlobalInfoExtension.class); - - JavaVersion minimumCompilerVersion = JavaVersion.toVersion(getResourceContents("/minimumCompilerVersion")); - JavaVersion minimumRuntimeVersion = JavaVersion.toVersion(getResourceContents("/minimumRuntimeVersion")); + JavaVersion minimumCompilerVersion = JavaVersion.toVersion(Util.getResourceContents("/minimumCompilerVersion")); + JavaVersion minimumRuntimeVersion = JavaVersion.toVersion(Util.getResourceContents("/minimumRuntimeVersion")); File compilerJavaHome = findCompilerJavaHome(); File runtimeJavaHome = findRuntimeJavaHome(compilerJavaHome); - String testSeedProperty = System.getProperty("tests.seed"); - final String testSeed; - if (testSeedProperty == null) { - long seed = new Random(System.currentTimeMillis()).nextLong(); - testSeed = Long.toUnsignedString(seed, 16).toUpperCase(Locale.ROOT); - } else { - testSeed = testSeedProperty; - } - - final String buildSnapshotSystemProperty = 
System.getProperty("build.snapshot", "true"); - final boolean isSnapshotBuild; - switch (buildSnapshotSystemProperty) { - case "true": - isSnapshotBuild = true; - break; - case "false": - isSnapshotBuild = false; - break; - default: - throw new IllegalArgumentException( - "build.snapshot was set to [" + buildSnapshotSystemProperty + "] but can only be unset or [true|false]" - ); - } - final List javaVersions = new ArrayList<>(); - for (int version = 8; version <= Integer.parseInt(minimumCompilerVersion.getMajorVersion()); version++) { - if (System.getenv(getJavaHomeEnvVarName(Integer.toString(version))) != null) { - javaVersions.add(JavaHome.of(version, new File(findJavaHome(Integer.toString(version))))); - } - } - - GenerateGlobalBuildInfoTask generateTask = project.getTasks() - .create("generateGlobalBuildInfo", GenerateGlobalBuildInfoTask.class, task -> { - task.setJavaVersions(javaVersions); - task.setMinimumCompilerVersion(minimumCompilerVersion); - task.setMinimumRuntimeVersion(minimumRuntimeVersion); - task.setCompilerJavaHome(compilerJavaHome); - task.setRuntimeJavaHome(runtimeJavaHome); - task.getOutputFile().set(new File(project.getBuildDir(), "global-build-info")); - task.getCompilerVersionFile().set(new File(project.getBuildDir(), "java-compiler-version")); - task.getRuntimeVersionFile().set(new File(project.getBuildDir(), "java-runtime-version")); - }); - - PrintGlobalBuildInfoTask printTask = project.getTasks().create("printGlobalBuildInfo", PrintGlobalBuildInfoTask.class, task -> { - task.getBuildInfoFile().set(generateTask.getOutputFile()); - task.getCompilerVersionFile().set(generateTask.getCompilerVersionFile()); - task.getRuntimeVersionFile().set(generateTask.getRuntimeVersionFile()); - task.setGlobalInfoListeners(extension.listeners); - }); - // Initialize global build parameters BuildParams.init(params -> { params.reset(); params.setCompilerJavaHome(compilerJavaHome); params.setRuntimeJavaHome(runtimeJavaHome); + params.setCompilerJavaVersion(determineJavaVersion("compiler java.home", compilerJavaHome, minimumCompilerVersion)); + params.setRuntimeJavaVersion(determineJavaVersion("runtime java.home", runtimeJavaHome, minimumRuntimeVersion)); params.setIsRutimeJavaHomeSet(compilerJavaHome.equals(runtimeJavaHome) == false); - params.setJavaVersions(javaVersions); + params.setJavaVersions(getAvailableJavaVersions(minimumCompilerVersion)); params.setMinimumCompilerVersion(minimumCompilerVersion); params.setMinimumRuntimeVersion(minimumRuntimeVersion); params.setGradleJavaVersion(Jvm.current().getJavaVersion()); params.setGitRevision(gitRevision(project.getRootProject().getRootDir())); params.setBuildDate(ZonedDateTime.now(ZoneOffset.UTC)); - params.setTestSeed(testSeed); + params.setTestSeed(getTestSeed()); params.setIsCi(System.getenv("JENKINS_URL") != null); params.setIsInternal(GlobalBuildInfoPlugin.class.getResource("/buildSrc.marker") != null); params.setDefaultParallel(findDefaultParallel(project)); - params.setInFipsJvm(isInFipsJvm()); - params.setIsSnapshotBuild(isSnapshotBuild); + params.setInFipsJvm(Util.getBooleanProperty("tests.fips.enabled", false)); + params.setIsSnapshotBuild(Util.getBooleanProperty("build.snapshot", true)); }); - project.allprojects( - p -> { - // Make sure than any task execution generates and prints build info - p.getTasks().configureEach(task -> { - if (task != generateTask && task != printTask) { - task.dependsOn(printTask); + // Print global build info header just before task execution + 
project.getGradle().getTaskGraph().whenReady(graph -> logGlobalBuildInfo()); + } + + private void logGlobalBuildInfo() { + final String osName = System.getProperty("os.name"); + final String osVersion = System.getProperty("os.version"); + final String osArch = System.getProperty("os.arch"); + final Jvm gradleJvm = Jvm.current(); + final String gradleJvmDetails = getJavaInstallation(gradleJvm.getJavaHome()).getImplementationName(); + + LOGGER.quiet("======================================="); + LOGGER.quiet("Elasticsearch Build Hamster says Hello!"); + LOGGER.quiet(" Gradle Version : " + GradleVersion.current().getVersion()); + LOGGER.quiet(" OS Info : " + osName + " " + osVersion + " (" + osArch + ")"); + if (Jvm.current().getJavaVersion().equals(BuildParams.getCompilerJavaVersion()) == false || BuildParams.getIsRuntimeJavaHomeSet()) { + String compilerJvmDetails = getJavaInstallation(BuildParams.getCompilerJavaHome()).getImplementationName(); + String runtimeJvmDetails = getJavaInstallation(BuildParams.getRuntimeJavaHome()).getImplementationName(); + + LOGGER.quiet(" Compiler JDK Version : " + BuildParams.getCompilerJavaVersion() + " (" + compilerJvmDetails + ")"); + LOGGER.quiet(" Compiler java.home : " + BuildParams.getCompilerJavaHome()); + LOGGER.quiet(" Runtime JDK Version : " + BuildParams.getRuntimeJavaVersion() + " (" + runtimeJvmDetails + ")"); + LOGGER.quiet(" Runtime java.home : " + BuildParams.getRuntimeJavaHome()); + LOGGER.quiet(" Gradle JDK Version : " + gradleJvm.getJavaVersion() + " (" + gradleJvmDetails + ")"); + LOGGER.quiet(" Gradle java.home : " + gradleJvm.getJavaHome()); + } else { + LOGGER.quiet(" JDK Version : " + gradleJvm.getJavaVersion() + " (" + gradleJvmDetails + ")"); + LOGGER.quiet(" JAVA_HOME : " + gradleJvm.getJavaHome()); + } + LOGGER.quiet(" Random Testing Seed : " + BuildParams.getTestSeed()); + LOGGER.quiet(" In FIPS 140 mode : " + BuildParams.isInFipsJvm()); + LOGGER.quiet("======================================="); + } + + private JavaVersion determineJavaVersion(String description, File javaHome, JavaVersion requiredVersion) { + JavaInstallation installation = getJavaInstallation(javaHome); + JavaVersion actualVersion = installation.getJavaVersion(); + if (actualVersion.isCompatibleWith(requiredVersion) == false) { + throwInvalidJavaHomeException( + description, + javaHome, + Integer.parseInt(requiredVersion.getMajorVersion()), + Integer.parseInt(actualVersion.getMajorVersion()) + ); + } + + return actualVersion; + } + + private JavaInstallation getJavaInstallation(File javaHome) { + JavaInstallation installation; + if (isCurrentJavaHome(javaHome)) { + installation = javaInstallationRegistry.getInstallationForCurrentVirtualMachine().get(); + } else { + installation = javaInstallationRegistry.installationForDirectory(objects.directoryProperty().fileValue(javaHome)).get(); + } + + return installation; + } + + private List getAvailableJavaVersions(JavaVersion minimumCompilerVersion) { + final List javaVersions = new ArrayList<>(); + for (int v = 8; v <= Integer.parseInt(minimumCompilerVersion.getMajorVersion()); v++) { + int version = v; + String javaHomeEnvVarName = getJavaHomeEnvVarName(Integer.toString(version)); + if (System.getenv(javaHomeEnvVarName) != null) { + File javaHomeDirectory = new File(findJavaHome(Integer.toString(version))); + Provider javaInstallationProvider = javaInstallationRegistry.installationForDirectory( + objects.directoryProperty().fileValue(javaHomeDirectory) + ); + JavaHome javaHome = JavaHome.of(version, 
providers.provider(() -> { + int actualVersion = Integer.parseInt(javaInstallationProvider.get().getJavaVersion().getMajorVersion()); + if (actualVersion != version) { + throwInvalidJavaHomeException("env variable " + javaHomeEnvVarName, javaHomeDirectory, version, actualVersion); } - }); + return javaHomeDirectory; + })); + javaVersions.add(javaHome); } + } + return javaVersions; + } + + private static boolean isCurrentJavaHome(File javaHome) { + try { + return Files.isSameFile(javaHome.toPath(), Jvm.current().getJavaHome().toPath()); + } catch (IOException e) { + throw new UncheckedIOException(e); + } + } + + private static String getTestSeed() { + String testSeedProperty = System.getProperty("tests.seed"); + final String testSeed; + if (testSeedProperty == null) { + long seed = new Random(System.currentTimeMillis()).nextLong(); + testSeed = Long.toUnsignedString(seed, 16).toUpperCase(Locale.ROOT); + } else { + testSeed = testSeedProperty; + } + return testSeed; + } + + private static void throwInvalidJavaHomeException(String description, File javaHome, int expectedVersion, int actualVersion) { + String message = String.format( + Locale.ROOT, + "The %s must be set to a JDK installation directory for Java %d but is [%s] corresponding to [%s]", + description, + expectedVersion, + javaHome, + actualVersion ); + + throw new GradleException(message); } private static File findCompilerJavaHome() { @@ -174,28 +249,6 @@ private static String getJavaHomeEnvVarName(String version) { return "JAVA" + version + "_HOME"; } - private static boolean isInFipsJvm() { - return Boolean.parseBoolean(System.getProperty("tests.fips.enabled")); - } - - private static String getResourceContents(String resourcePath) { - try ( - BufferedReader reader = new BufferedReader(new InputStreamReader(GlobalBuildInfoPlugin.class.getResourceAsStream(resourcePath))) - ) { - StringBuilder b = new StringBuilder(); - for (String line = reader.readLine(); line != null; line = reader.readLine()) { - if (b.length() != 0) { - b.append('\n'); - } - b.append(line); - } - - return b.toString(); - } catch (IOException e) { - throw new UncheckedIOException("Error trying to read classpath resource: " + resourcePath, e); - } - } - private static int findDefaultParallel(Project project) { // Since it costs IO to compute this, and is done at configuration time we want to cache this if possible // It's safe to store this in a static variable since it's just a primitive so leaking memory isn't an issue diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/info/GlobalInfoExtension.java b/buildSrc/src/main/java/org/elasticsearch/gradle/info/GlobalInfoExtension.java deleted file mode 100644 index a2daa4a5767c0..0000000000000 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/info/GlobalInfoExtension.java +++ /dev/null @@ -1,12 +0,0 @@ -package org.elasticsearch.gradle.info; - -import java.util.ArrayList; -import java.util.List; - -public class GlobalInfoExtension { - final List listeners = new ArrayList<>(); - - public void ready(Runnable block) { - listeners.add(block); - } -} diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/info/PrintGlobalBuildInfoTask.java b/buildSrc/src/main/java/org/elasticsearch/gradle/info/PrintGlobalBuildInfoTask.java deleted file mode 100644 index fdd79e0fc9cb3..0000000000000 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/info/PrintGlobalBuildInfoTask.java +++ /dev/null @@ -1,75 +0,0 @@ -package org.elasticsearch.gradle.info; - -import org.gradle.api.DefaultTask; -import 
org.gradle.api.JavaVersion; -import org.gradle.api.file.RegularFileProperty; -import org.gradle.api.model.ObjectFactory; -import org.gradle.api.resources.TextResource; -import org.gradle.api.tasks.InputFile; -import org.gradle.api.tasks.TaskAction; - -import javax.inject.Inject; -import java.util.ArrayList; -import java.util.List; - -public class PrintGlobalBuildInfoTask extends DefaultTask { - private final RegularFileProperty buildInfoFile; - private final RegularFileProperty compilerVersionFile; - private final RegularFileProperty runtimeVersionFile; - private List globalInfoListeners = new ArrayList<>(); - - @Inject - public PrintGlobalBuildInfoTask(ObjectFactory objectFactory) { - this.buildInfoFile = objectFactory.fileProperty(); - this.compilerVersionFile = objectFactory.fileProperty(); - this.runtimeVersionFile = objectFactory.fileProperty(); - } - - @InputFile - public RegularFileProperty getBuildInfoFile() { - return buildInfoFile; - } - - @InputFile - public RegularFileProperty getCompilerVersionFile() { - return compilerVersionFile; - } - - @InputFile - public RegularFileProperty getRuntimeVersionFile() { - return runtimeVersionFile; - } - - public void setGlobalInfoListeners(List globalInfoListeners) { - this.globalInfoListeners = globalInfoListeners; - } - - @TaskAction - public void print() { - getLogger().quiet("======================================="); - getLogger().quiet("Elasticsearch Build Hamster says Hello!"); - getLogger().quiet(getFileText(getBuildInfoFile()).asString()); - getLogger().quiet(" Random Testing Seed : " + BuildParams.getTestSeed()); - getLogger().quiet(" In FIPS 140 mode : " + BuildParams.isInFipsJvm()); - getLogger().quiet("======================================="); - - setGlobalProperties(); - globalInfoListeners.forEach(Runnable::run); - - // Since all tasks depend on this task, and it always runs for every build, this makes sure that lifecycle tasks will still - // correctly report as UP-TO-DATE, since the convention is a lifecycle task (i.e. assemble, build, etc) will only be marked as - // UP-TO-DATE if all upstream tasks were also UP-TO-DATE. 
- setDidWork(false); - } - - private TextResource getFileText(RegularFileProperty regularFileProperty) { - return getProject().getResources().getText().fromFile(regularFileProperty.getAsFile().get()); - } - - private void setGlobalProperties() { - BuildParams.init(params -> { - params.setCompilerJavaVersion(JavaVersion.valueOf(getFileText(getCompilerVersionFile()).asString())); - params.setRuntimeJavaVersion(JavaVersion.valueOf(getFileText(getRuntimeVersionFile()).asString())); - }); - } -} diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/FilePermissionsTask.java b/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/FilePermissionsTask.java index 68839f86c1035..35ad889b34125 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/FilePermissionsTask.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/FilePermissionsTask.java @@ -28,7 +28,7 @@ import java.util.stream.Collectors; import org.apache.tools.ant.taskdefs.condition.Os; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.DefaultTask; import org.gradle.api.GradleException; import org.gradle.api.file.FileCollection; @@ -80,7 +80,7 @@ private static boolean isExecutableFile(File file) { @InputFiles @SkipWhenEmpty public FileCollection getFiles() { - return Boilerplate.getJavaSourceSets(getProject()) + return GradleUtils.getJavaSourceSets(getProject()) .stream() .map(sourceSet -> sourceSet.getAllSource().matching(filesFilter)) .reduce(FileTree::plus) diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/TestingConventionsTasks.java b/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/TestingConventionsTasks.java index a662ff4dbcaeb..3657808b91979 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/TestingConventionsTasks.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/precommit/TestingConventionsTasks.java @@ -19,7 +19,7 @@ package org.elasticsearch.gradle.precommit; import groovy.lang.Closure; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.DefaultTask; import org.gradle.api.NamedDomainObjectContainer; import org.gradle.api.Task; @@ -65,7 +65,7 @@ public class TestingConventionsTasks extends DefaultTask { public TestingConventionsTasks() { setDescription("Tests various testing conventions"); // Run only after everything is compiled - Boilerplate.getJavaSourceSets(getProject()).all(sourceSet -> dependsOn(sourceSet.getOutput().getClassesDirs())); + GradleUtils.getJavaSourceSets(getProject()).all(sourceSet -> dependsOn(sourceSet.getOutput().getClassesDirs())); naming = getProject().container(TestingConventionRule.class); } @@ -81,7 +81,7 @@ public Map> getClassFilesPerEnabledTask() { @Input public Map getTestClassNames() { if (testClassNames == null) { - testClassNames = Boilerplate.getJavaSourceSets(getProject()) + testClassNames = GradleUtils.getJavaSourceSets(getProject()) .getByName("test") .getOutput() .getClassesDirs() @@ -110,7 +110,7 @@ public void naming(Closure action) { @Input public Set getMainClassNamedLikeTests() { - SourceSetContainer javaSourceSets = Boilerplate.getJavaSourceSets(getProject()); + SourceSetContainer javaSourceSets = GradleUtils.getJavaSourceSets(getProject()); if (javaSourceSets.findByName(SourceSet.MAIN_SOURCE_SET_NAME) == null) { // some test projects don't have a main source set return Collections.emptySet(); @@ -351,7 +351,7 @@ private 
FileCollection getTestsClassPath() { // running the tests. return getProject().files( getProject().getConfigurations().getByName("testRuntime").resolve(), - Boilerplate.getJavaSourceSets(getProject()) + GradleUtils.getJavaSourceSets(getProject()) .stream() .flatMap(sourceSet -> sourceSet.getOutput().getClassesDirs().getFiles().stream()) .collect(Collectors.toList()) diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/test/DistroTestPlugin.java b/buildSrc/src/main/java/org/elasticsearch/gradle/test/DistroTestPlugin.java index 8b62e81e626e8..9826c50836e11 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/test/DistroTestPlugin.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/test/DistroTestPlugin.java @@ -32,7 +32,7 @@ import org.elasticsearch.gradle.docker.DockerSupportPlugin; import org.elasticsearch.gradle.docker.DockerSupportService; import org.elasticsearch.gradle.info.BuildParams; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.elasticsearch.gradle.vagrant.BatsProgressLogger; import org.elasticsearch.gradle.vagrant.VagrantBasePlugin; import org.elasticsearch.gradle.vagrant.VagrantExtension; @@ -90,7 +90,7 @@ public void apply(Project project) { project.getPluginManager().apply(DistributionDownloadPlugin.class); project.getPluginManager().apply("elasticsearch.build"); - Provider dockerSupport = Boilerplate.getBuildService( + Provider dockerSupport = GradleUtils.getBuildService( project.getGradle().getSharedServices(), DockerSupportPlugin.DOCKER_SUPPORT_SERVICE_NAME ); @@ -175,12 +175,14 @@ private static Jdk createJdk( String name, String vendor, String version, - String platform + String platform, + String architecture ) { Jdk jdk = jdksContainer.create(name); jdk.setVendor(vendor); jdk.setVersion(version); jdk.setPlatform(platform); + jdk.setArchitecture(architecture); return jdk; } @@ -212,8 +214,8 @@ private static List configureVM(Project project) { NamedDomainObjectContainer jdksContainer = JdkDownloadPlugin.getContainer(project); String platform = box.contains("windows") ? 
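+            // note: the system and gradle JDKs created below are pinned to x64 builds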
"windows" : "linux"; - Jdk systemJdk = createJdk(jdksContainer, "system", SYSTEM_JDK_VENDOR, SYSTEM_JDK_VERSION, platform); - Jdk gradleJdk = createJdk(jdksContainer, "gradle", GRADLE_JDK_VENDOR, GRADLE_JDK_VERSION, platform); + Jdk systemJdk = createJdk(jdksContainer, "system", SYSTEM_JDK_VENDOR, SYSTEM_JDK_VERSION, platform, "x64"); + Jdk gradleJdk = createJdk(jdksContainer, "gradle", GRADLE_JDK_VENDOR, GRADLE_JDK_VERSION, platform, "x64"); // setup VM used by these tests VagrantExtension vagrant = project.getExtensions().getByType(VagrantExtension.class); diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestApiTask.java b/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestApiTask.java index 1cb3034419bbe..a059400844c3d 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestApiTask.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestApiTask.java @@ -20,7 +20,7 @@ import org.elasticsearch.gradle.VersionProperties; import org.elasticsearch.gradle.info.BuildParams; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.DefaultTask; import org.gradle.api.Project; import org.gradle.api.artifacts.Configuration; @@ -203,6 +203,6 @@ private File getTestOutputResourceDir() { } private SourceSet getTestSourceSet() { - return Boilerplate.getJavaSourceSets(getProject()).findByName("test"); + return GradleUtils.getJavaSourceSets(getProject()).findByName("test"); } } diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestTestsTask.java b/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestTestsTask.java index 9e9d1e0d9e795..0d067b40edb29 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestTestsTask.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/CopyRestTestsTask.java @@ -20,7 +20,7 @@ import org.elasticsearch.gradle.VersionProperties; import org.elasticsearch.gradle.info.BuildParams; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.DefaultTask; import org.gradle.api.Project; import org.gradle.api.artifacts.Configuration; @@ -114,7 +114,7 @@ void copy() { if (BuildParams.isInternal()) { getLogger().debug("Rest tests for project [{}] will be copied to the test resources.", project.getPath()); project.copy(c -> { - c.from(coreConfig.getSingleFile()); + c.from(coreConfig.getAsFileTree()); c.into(getOutputDir()); c.include(corePatternSet.getIncludes()); }); @@ -138,7 +138,7 @@ void copy() { if (includeXpack.get().isEmpty() == false) { getLogger().debug("X-pack rest tests for project [{}] will be copied to the test resources.", project.getPath()); project.copy(c -> { - c.from(xpackConfig.getSingleFile()); + c.from(xpackConfig.getAsFileTree()); c.into(getOutputDir()); c.include(xpackPatternSet.getIncludes()); }); @@ -146,6 +146,6 @@ void copy() { } private SourceSet getTestSourceSet() { - return Boilerplate.getJavaSourceSets(getProject()).findByName("test"); + return GradleUtils.getJavaSourceSets(getProject()).findByName("test"); } } diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/RestResourcesPlugin.java b/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/RestResourcesPlugin.java index a512b9b1fc025..3cdc4d24dd3c1 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/RestResourcesPlugin.java +++ 
b/buildSrc/src/main/java/org/elasticsearch/gradle/test/rest/RestResourcesPlugin.java @@ -22,6 +22,7 @@ import org.elasticsearch.gradle.info.BuildParams; import org.gradle.api.Plugin; import org.gradle.api.Project; +import org.gradle.api.artifacts.Configuration; import org.gradle.api.artifacts.Dependency; import org.gradle.api.provider.Provider; @@ -86,21 +87,31 @@ public class RestResourcesPlugin implements Plugin { public void apply(Project project) { RestResourcesExtension extension = project.getExtensions().create(EXTENSION_NAME, RestResourcesExtension.class); + // tests + Configuration testConfig = project.getConfigurations().create("restTestConfig"); + Configuration xpackTestConfig = project.getConfigurations().create("restXpackTest"); + project.getConfigurations().create("restTests"); + project.getConfigurations().create("restXpackTests"); Provider copyRestYamlTestTask = project.getTasks() .register("copyYamlTestsTask", CopyRestTestsTask.class, task -> { task.includeCore.set(extension.restTests.getIncludeCore()); task.includeXpack.set(extension.restTests.getIncludeXpack()); - task.coreConfig = project.getConfigurations().create("restTest"); + task.coreConfig = testConfig; if (BuildParams.isInternal()) { + // core Dependency restTestdependency = project.getDependencies() .project(Map.of("path", ":rest-api-spec", "configuration", "restTests")); project.getDependencies().add(task.coreConfig.getName(), restTestdependency); - - task.xpackConfig = project.getConfigurations().create("restXpackTest"); + // x-pack + task.xpackConfig = xpackTestConfig; Dependency restXPackTestdependency = project.getDependencies() .project(Map.of("path", ":x-pack:plugin", "configuration", "restXpackTests")); project.getDependencies().add(task.xpackConfig.getName(), restXPackTestdependency); task.dependsOn(task.xpackConfig); + // watcher + Dependency restWatcherTests = project.getDependencies() + .project(Map.of("path", ":x-pack:plugin:watcher:qa:rest", "configuration", "restXpackTests")); + project.getDependencies().add(task.xpackConfig.getName(), restWatcherTests); } else { Dependency dependency = project.getDependencies() .create("org.elasticsearch:rest-api-spec:" + VersionProperties.getElasticsearch()); @@ -109,18 +120,22 @@ public void apply(Project project) { task.dependsOn(task.coreConfig); }); + // api + Configuration specConfig = project.getConfigurations().create("restSpec"); // name chosen for passivity + Configuration xpackSpecConfig = project.getConfigurations().create("restXpackSpec"); + project.getConfigurations().create("restSpecs"); + project.getConfigurations().create("restXpackSpecs"); Provider copyRestYamlSpecTask = project.getTasks() .register("copyRestApiSpecsTask", CopyRestApiTask.class, task -> { task.includeCore.set(extension.restApi.getIncludeCore()); task.includeXpack.set(extension.restApi.getIncludeXpack()); task.dependsOn(copyRestYamlTestTask); - task.coreConfig = project.getConfigurations().create("restSpec"); + task.coreConfig = specConfig; if (BuildParams.isInternal()) { Dependency restSpecDependency = project.getDependencies() .project(Map.of("path", ":rest-api-spec", "configuration", "restSpecs")); project.getDependencies().add(task.coreConfig.getName(), restSpecDependency); - - task.xpackConfig = project.getConfigurations().create("restXpackSpec"); + task.xpackConfig = xpackSpecConfig; Dependency restXpackSpecDependency = project.getDependencies() .project(Map.of("path", ":x-pack:plugin", "configuration", "restXpackSpecs")); 
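+                    // wire the x-pack spec artifacts into the task's configuration, mirroring the core spec handling above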
project.getDependencies().add(task.xpackConfig.getName(), restXpackSpecDependency); diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/RestTestRunnerTask.java b/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/RestTestRunnerTask.java index 5cd88ea01a32a..62427bc1d0a66 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/RestTestRunnerTask.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/RestTestRunnerTask.java @@ -1,6 +1,6 @@ package org.elasticsearch.gradle.testclusters; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.provider.Provider; import org.gradle.api.services.internal.BuildServiceRegistryInternal; import org.gradle.api.tasks.CacheableTask; @@ -62,7 +62,7 @@ public Collection getClusters() { public List getSharedResources() { List locks = new ArrayList<>(super.getSharedResources()); BuildServiceRegistryInternal serviceRegistry = getServices().get(BuildServiceRegistryInternal.class); - Provider throttleProvider = Boilerplate.getBuildService(serviceRegistry, THROTTLE_SERVICE_NAME); + Provider throttleProvider = GradleUtils.getBuildService(serviceRegistry, THROTTLE_SERVICE_NAME); SharedResource resource = serviceRegistry.forService(throttleProvider); int nodeCount = clusters.stream().mapToInt(cluster -> cluster.getNodes().size()).sum(); diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/TestClustersPlugin.java b/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/TestClustersPlugin.java index ebdf24d0a318f..48c436c42e03b 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/TestClustersPlugin.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/testclusters/TestClustersPlugin.java @@ -21,7 +21,7 @@ import org.elasticsearch.gradle.DistributionDownloadPlugin; import org.elasticsearch.gradle.ReaperPlugin; import org.elasticsearch.gradle.ReaperService; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.NamedDomainObjectContainer; import org.gradle.api.Plugin; import org.gradle.api.Project; @@ -36,7 +36,7 @@ import java.io.File; -import static org.elasticsearch.gradle.tool.Boilerplate.noop; +import static org.elasticsearch.gradle.util.GradleUtils.noop; public class TestClustersPlugin implements Plugin { @@ -102,7 +102,7 @@ public void apply(Project project) { throw new IllegalStateException(this.getClass().getName() + " can only be applied to the root project."); } - Provider registryProvider = Boilerplate.getBuildService( + Provider registryProvider = GradleUtils.getBuildService( project.getGradle().getSharedServices(), REGISTRY_SERVICE_NAME ); diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/testfixtures/TestFixturesPlugin.java b/buildSrc/src/main/java/org/elasticsearch/gradle/testfixtures/TestFixturesPlugin.java index 64ea96e9f8827..88e744043b747 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/testfixtures/TestFixturesPlugin.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/testfixtures/TestFixturesPlugin.java @@ -29,7 +29,7 @@ import org.elasticsearch.gradle.docker.DockerSupportService; import org.elasticsearch.gradle.info.BuildParams; import org.elasticsearch.gradle.precommit.TestingConventionsTasks; -import org.elasticsearch.gradle.tool.Boilerplate; +import org.elasticsearch.gradle.util.GradleUtils; import org.gradle.api.Action; import org.gradle.api.DefaultTask; import 
org.gradle.api.Plugin; @@ -67,7 +67,7 @@ public void apply(Project project) { .getSharedServices() .registerIfAbsent(DOCKER_COMPOSE_THROTTLE, DockerComposeThrottle.class, spec -> spec.getMaxParallelUsages().set(1)); - Provider dockerSupport = Boilerplate.getBuildService( + Provider dockerSupport = GradleUtils.getBuildService( project.getGradle().getSharedServices(), DockerSupportPlugin.DOCKER_SUPPORT_SERVICE_NAME ); diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/tool/Boilerplate.java b/buildSrc/src/main/java/org/elasticsearch/gradle/util/GradleUtils.java similarity index 98% rename from buildSrc/src/main/java/org/elasticsearch/gradle/tool/Boilerplate.java rename to buildSrc/src/main/java/org/elasticsearch/gradle/util/GradleUtils.java index 85002b9c49821..81de7ea8176c3 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/tool/Boilerplate.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/util/GradleUtils.java @@ -16,7 +16,7 @@ * specific language governing permissions and limitations * under the License. */ -package org.elasticsearch.gradle.tool; +package org.elasticsearch.gradle.util; import org.gradle.api.Action; import org.gradle.api.GradleException; @@ -36,7 +36,7 @@ import java.util.Optional; -public abstract class Boilerplate { +public abstract class GradleUtils { public static Action noop() { return t -> {}; diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/Util.java b/buildSrc/src/main/java/org/elasticsearch/gradle/util/Util.java similarity index 60% rename from buildSrc/src/main/java/org/elasticsearch/gradle/Util.java rename to buildSrc/src/main/java/org/elasticsearch/gradle/util/Util.java index 9ca4efc8b9f37..03ebe906625b6 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/Util.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/util/Util.java @@ -7,7 +7,7 @@ * not use this file except in compliance with the License. * You may obtain a copy of the License at * - * http://www.apache.org/licenses/LICENSE-2.0 + * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, * software distributed under the License is distributed on an @@ -17,10 +17,15 @@ * under the License. 
*/ -package org.elasticsearch.gradle; +package org.elasticsearch.gradle.util; +import org.elasticsearch.gradle.info.GlobalBuildInfoPlugin; import org.gradle.api.GradleException; +import java.io.BufferedReader; +import java.io.IOException; +import java.io.InputStreamReader; +import java.io.UncheckedIOException; import java.util.Locale; public class Util { @@ -39,6 +44,24 @@ public static boolean getBooleanProperty(String property, boolean defaultValue) } } + public static String getResourceContents(String resourcePath) { + try ( + BufferedReader reader = new BufferedReader(new InputStreamReader(GlobalBuildInfoPlugin.class.getResourceAsStream(resourcePath))) + ) { + StringBuilder b = new StringBuilder(); + for (String line = reader.readLine(); line != null; line = reader.readLine()) { + if (b.length() != 0) { + b.append('\n'); + } + b.append(line); + } + + return b.toString(); + } catch (IOException e) { + throw new UncheckedIOException("Error trying to read classpath resource: " + resourcePath, e); + } + } + public static String capitalize(String s) { return s.substring(0, 1).toUpperCase(Locale.ROOT) + s.substring(1); } diff --git a/buildSrc/src/main/java/org/elasticsearch/gradle/vagrant/VagrantMachine.java b/buildSrc/src/main/java/org/elasticsearch/gradle/vagrant/VagrantMachine.java index 945a5d4ed0374..507b977569c0c 100644 --- a/buildSrc/src/main/java/org/elasticsearch/gradle/vagrant/VagrantMachine.java +++ b/buildSrc/src/main/java/org/elasticsearch/gradle/vagrant/VagrantMachine.java @@ -23,7 +23,7 @@ import org.elasticsearch.gradle.LoggedExec; import org.elasticsearch.gradle.LoggingOutputStream; import org.elasticsearch.gradle.ReaperService; -import org.elasticsearch.gradle.Util; +import org.elasticsearch.gradle.util.Util; import org.gradle.api.Action; import org.gradle.api.Project; import org.gradle.internal.logging.progress.ProgressLogger; diff --git a/buildSrc/src/main/resources/checkstyle.xml b/buildSrc/src/main/resources/checkstyle.xml index 7a59ee5e0fa8b..bb4e0a124b69c 100644 --- a/buildSrc/src/main/resources/checkstyle.xml +++ b/buildSrc/src/main/resources/checkstyle.xml @@ -26,7 +26,7 @@ --> - + diff --git a/buildSrc/src/main/resources/checkstyle_suppressions.xml b/buildSrc/src/main/resources/checkstyle_suppressions.xml index 7bf5087485733..d79d6a95e184e 100644 --- a/buildSrc/src/main/resources/checkstyle_suppressions.xml +++ b/buildSrc/src/main/resources/checkstyle_suppressions.xml @@ -31,6 +31,7 @@ --> + diff --git a/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/BuildParams.java b/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/BuildParams.java index a76413d6b47d4..624a3e6f5af61 100644 --- a/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/BuildParams.java +++ b/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/BuildParams.java @@ -3,11 +3,6 @@ import org.gradle.api.JavaVersion; import java.io.File; -import java.lang.annotation.Documented; -import java.lang.annotation.ElementType; -import java.lang.annotation.Retention; -import java.lang.annotation.RetentionPolicy; -import java.lang.annotation.Target; import java.lang.reflect.Modifier; import java.time.ZonedDateTime; import java.util.Arrays; @@ -75,12 +70,10 @@ public static JavaVersion getGradleJavaVersion() { return value(gradleJavaVersion); } - @ExecutionTime public static JavaVersion getCompilerJavaVersion() { return value(compilerJavaVersion); } - @ExecutionTime public static JavaVersion getRuntimeJavaVersion() { return value(runtimeJavaVersion); } @@ -120,22 
+113,13 @@ public static boolean isSnapshotBuild() { private static T value(T object) { if (object == null) { String callingMethod = Thread.currentThread().getStackTrace()[2].getMethodName(); - boolean executionTime; - try { - executionTime = BuildParams.class.getMethod(callingMethod).getAnnotation(ExecutionTime.class) != null; - } catch (NoSuchMethodException e) { - throw new RuntimeException(e); - } - String message = "Build parameter '" + propertyName(callingMethod) + "' has not been initialized. "; - if (executionTime) { - message += "This property is initialized at execution time, " - + "please ensure you are not attempting to access it during project configuration."; - } else { - message += "Perhaps the plugin responsible for initializing this property has not been applied."; - } - - throw new IllegalStateException(message); + throw new IllegalStateException( + "Build parameter '" + + propertyName(callingMethod) + + "' has not been initialized.\n" + + "Perhaps the plugin responsible for initializing this property has not been applied." + ); } return object; @@ -236,14 +220,4 @@ public void setIsSnapshotBuild(final boolean isSnapshotBuild) { } } - - /** - * Indicates that a build parameter is initialized at task execution time and is not available at project configuration time. - * Attempts to read an uninitialized parameter wil result in an {@link IllegalStateException}. - */ - @Target({ ElementType.METHOD, ElementType.FIELD }) - @Retention(RetentionPolicy.RUNTIME) - @Documented - public @interface ExecutionTime { - } } diff --git a/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/JavaHome.java b/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/JavaHome.java index 29ca2bafc79dc..1fe376545eaac 100644 --- a/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/JavaHome.java +++ b/buildSrc/src/minimumRuntime/java/org/elasticsearch/gradle/info/JavaHome.java @@ -1,35 +1,27 @@ package org.elasticsearch.gradle.info; -import org.gradle.api.tasks.Input; -import org.gradle.api.tasks.InputDirectory; -import org.gradle.api.tasks.Optional; -import org.gradle.api.tasks.PathSensitive; -import org.gradle.api.tasks.PathSensitivity; +import org.gradle.api.provider.Provider; import java.io.File; public class JavaHome { private Integer version; - private File javaHome; + private Provider javaHome; - private JavaHome(int version, File javaHome) { + private JavaHome(int version, Provider javaHome) { this.version = version; this.javaHome = javaHome; } - public static JavaHome of(int version, File javaHome) { + public static JavaHome of(int version, Provider javaHome) { return new JavaHome(version, javaHome); } - @Input public Integer getVersion() { return version; } - @InputDirectory - @Optional - @PathSensitive(PathSensitivity.RELATIVE) - public File getJavaHome() { + public Provider getJavaHome() { return javaHome; } } diff --git a/buildSrc/src/test/java/org/elasticsearch/gradle/JdkDownloadPluginTests.java b/buildSrc/src/test/java/org/elasticsearch/gradle/JdkDownloadPluginTests.java index ea291858913f8..758aacbf90c8c 100644 --- a/buildSrc/src/test/java/org/elasticsearch/gradle/JdkDownloadPluginTests.java +++ b/buildSrc/src/test/java/org/elasticsearch/gradle/JdkDownloadPluginTests.java @@ -36,7 +36,7 @@ public static void setupRoot() { } public void testMissingVendor() { - assertJdkError(createProject(), "testjdk", null, "11.0.2+33", "linux", "vendor not specified for jdk [testjdk]"); + assertJdkError(createProject(), "testjdk", null, "11.0.2+33", "linux", "x64", 
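+            // architecture is now a mandatory jdk attribute, so every assertion passes an explicit "x64"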
"vendor not specified for jdk [testjdk]"); } public void testUnknownVendor() { @@ -46,20 +46,29 @@ public void testUnknownVendor() { "unknown", "11.0.2+33", "linux", + "x64", "unknown vendor [unknown] for jdk [testjdk], must be one of [adoptopenjdk, openjdk]" ); } public void testMissingVersion() { - assertJdkError(createProject(), "testjdk", "openjdk", null, "linux", "version not specified for jdk [testjdk]"); + assertJdkError(createProject(), "testjdk", "openjdk", null, "linux", "x64", "version not specified for jdk [testjdk]"); } public void testBadVersionFormat() { - assertJdkError(createProject(), "testjdk", "openjdk", "badversion", "linux", "malformed version [badversion] for jdk [testjdk]"); + assertJdkError( + createProject(), + "testjdk", + "openjdk", + "badversion", + "linux", + "x64", + "malformed version [badversion] for jdk [testjdk]" + ); } public void testMissingPlatform() { - assertJdkError(createProject(), "testjdk", "openjdk", "11.0.2+33", null, "platform not specified for jdk [testjdk]"); + assertJdkError(createProject(), "testjdk", "openjdk", "11.0.2+33", null, "x64", "platform not specified for jdk [testjdk]"); } public void testUnknownPlatform() { @@ -69,19 +78,44 @@ public void testUnknownPlatform() { "openjdk", "11.0.2+33", "unknown", + "x64", "unknown platform [unknown] for jdk [testjdk], must be one of [darwin, linux, windows, mac]" ); } - private void assertJdkError(Project project, String name, String vendor, String version, String platform, String message) { + public void testMissingArchitecture() { + assertJdkError(createProject(), "testjdk", "openjdk", "11.0.2+33", "linux", null, "architecture not specified for jdk [testjdk]"); + } + + public void testUnknownArchitecture() { + assertJdkError( + createProject(), + "testjdk", + "openjdk", + "11.0.2+33", + "linux", + "unknown", + "unknown architecture [unknown] for jdk [testjdk], must be one of [aarch64, x64]" + ); + } + + private void assertJdkError( + final Project project, + final String name, + final String vendor, + final String version, + final String platform, + final String architecture, + final String message + ) { IllegalArgumentException e = expectThrows( IllegalArgumentException.class, - () -> createJdk(project, name, vendor, version, platform) + () -> createJdk(project, name, vendor, version, platform, architecture) ); assertThat(e.getMessage(), equalTo(message)); } - private void createJdk(Project project, String name, String vendor, String version, String platform) { + private void createJdk(Project project, String name, String vendor, String version, String platform, String architecture) { @SuppressWarnings("unchecked") NamedDomainObjectContainer jdks = (NamedDomainObjectContainer) project.getExtensions().getByName("jdks"); jdks.create(name, jdk -> { @@ -94,6 +128,9 @@ private void createJdk(Project project, String name, String vendor, String versi if (platform != null) { jdk.setPlatform(platform); } + if (architecture != null) { + jdk.setArchitecture(architecture); + } }).finalizeValues(); } diff --git a/buildSrc/src/testKit/jdk-download/reuse/build.gradle b/buildSrc/src/testKit/jdk-download/reuse/build.gradle index 795098c4b5229..39a5c3372d5f5 100644 --- a/buildSrc/src/testKit/jdk-download/reuse/build.gradle +++ b/buildSrc/src/testKit/jdk-download/reuse/build.gradle @@ -7,5 +7,6 @@ jdks { vendor = fakeJdkVendor version = fakeJdkVersion platform = "linux" + architecture = "x64" } } diff --git a/buildSrc/src/testKit/jdk-download/subproj/build.gradle 
b/buildSrc/src/testKit/jdk-download/subproj/build.gradle index a0713bef204d9..a19a6bead4c7b 100644 --- a/buildSrc/src/testKit/jdk-download/subproj/build.gradle +++ b/buildSrc/src/testKit/jdk-download/subproj/build.gradle @@ -10,16 +10,19 @@ jdks { vendor = fakeJdkVendor version = fakeJdkVersion platform = "linux" + architecture = "x64" } darwin { vendor = fakeJdkVendor version = fakeJdkVersion platform = "darwin" + architecture = "x64" } windows { vendor = fakeJdkVendor version = fakeJdkVersion platform = "windows" + architecture = "x64" } } diff --git a/buildSrc/version.properties b/buildSrc/version.properties index dea67de1657d1..2ffc9101b9d80 100644 --- a/buildSrc/version.properties +++ b/buildSrc/version.properties @@ -1,18 +1,14 @@ elasticsearch = 8.0.0 -lucene = 8.5.0-snapshot-7f057455901 +lucene = 8.5.0 bundled_jdk_vendor = adoptopenjdk -bundled_jdk = 13.0.2+8 +bundled_jdk = 14+36 # optional dependencies spatial4j = 0.7 jts = 1.15.0 -# note that ingest-geoip has a hard-coded version; if you modify this version, -# you should also inspect that version to see if it can be advanced along with -# the com.maxmind.geoip2:geoip2 dependency -jackson = 2.8.11 -jacksondatabind = 2.8.11.6 -snakeyaml = 1.17 +jackson = 2.10.3 +snakeyaml = 1.24 icu4j = 62.1 supercsv = 2.4.0 # when updating log4j, please update also docs/java-api/index.asciidoc diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/AsyncSearchClient.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/AsyncSearchClient.java new file mode 100644 index 0000000000000..bbf9c15e20cdf --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/AsyncSearchClient.java @@ -0,0 +1,120 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.client; + +import org.elasticsearch.action.ActionListener; +import org.elasticsearch.client.asyncsearch.AsyncSearchResponse; +import org.elasticsearch.client.asyncsearch.DeleteAsyncSearchRequest; +import org.elasticsearch.client.asyncsearch.GetAsyncSearchRequest; +import org.elasticsearch.client.asyncsearch.SubmitAsyncSearchRequest; +import org.elasticsearch.client.core.AcknowledgedResponse; + +import java.io.IOException; + +import static java.util.Collections.emptySet; + +public class AsyncSearchClient { + private final RestHighLevelClient restHighLevelClient; + + AsyncSearchClient(RestHighLevelClient restHighLevelClient) { + this.restHighLevelClient = restHighLevelClient; + } + + /** + * Submit a new async search request. + * See the docs for more. + * @param request the request + * @param options the request options (e.g. 
headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
+     * @return the response
+     * @throws IOException in case there is a problem sending the request or parsing back the response
+     */
+    public AsyncSearchResponse submit(SubmitAsyncSearchRequest request, RequestOptions options) throws IOException {
+        return restHighLevelClient.performRequestAndParseEntity(request, AsyncSearchRequestConverters::submitAsyncSearch, options,
+                AsyncSearchResponse::fromXContent, emptySet());
+    }
+
+    /**
+     * Asynchronously submit a new async search request.
+     * See the docs for more.
+     * @param request the request
+     * @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
+     * @param listener the listener to be notified upon request completion
+     * @return cancellable that may be used to cancel the request
+     */
+    public Cancellable submitAsync(SubmitAsyncSearchRequest request, RequestOptions options,
+                                   ActionListener listener) {
+        return restHighLevelClient.performRequestAsyncAndParseEntity(request, AsyncSearchRequestConverters::submitAsyncSearch, options,
+                AsyncSearchResponse::fromXContent, listener, emptySet());
+    }
+
+    /**
+     * Get an async search request.
+     * See the docs for more.
+     * @param request the request
+     * @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
+     * @return the response
+     * @throws IOException in case there is a problem sending the request or parsing back the response
+     */
+    public AsyncSearchResponse get(GetAsyncSearchRequest request, RequestOptions options) throws IOException {
+        return restHighLevelClient.performRequestAndParseEntity(request, AsyncSearchRequestConverters::getAsyncSearch, options,
+                AsyncSearchResponse::fromXContent, emptySet());
+    }
+
+    /**
+     * Asynchronously get an async search request.
+     * See the docs for more.
+     * @param request the request
+     * @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
+     * @param listener the listener to be notified upon request completion
+     * @return cancellable that may be used to cancel the request
+     */
+    public Cancellable getAsync(GetAsyncSearchRequest request, RequestOptions options,
+                                ActionListener listener) {
+        return restHighLevelClient.performRequestAsyncAndParseEntity(request, AsyncSearchRequestConverters::getAsyncSearch, options,
+                AsyncSearchResponse::fromXContent, listener, emptySet());
+    }
+
+    /**
+     * Delete an async search request.
+     * See the docs for more.
+     * @param request the request
+     * @param options the request options (e.g. headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized
+     * @return the response
+     * @throws IOException in case there is a problem sending the request or parsing back the response
+     */
+    public AcknowledgedResponse delete(DeleteAsyncSearchRequest request, RequestOptions options) throws IOException {
+        return restHighLevelClient.performRequestAndParseEntity(request, AsyncSearchRequestConverters::deleteAsyncSearch, options,
+                AcknowledgedResponse::fromXContent, emptySet());
+    }
+
+    /**
+     * Asynchronously delete an async search request.
+     * See the docs for more.
+     * @param request the request
+     * @param options the request options (e.g.
headers), use {@link RequestOptions#DEFAULT} if nothing needs to be customized + * @param listener the listener to be notified upon request completion + * @return cancellable that may be used to cancel the request + */ + public Cancellable deleteAsync(DeleteAsyncSearchRequest request, RequestOptions options, + ActionListener listener) { + return restHighLevelClient.performRequestAsyncAndParseEntity(request, AsyncSearchRequestConverters::deleteAsyncSearch, options, + AcknowledgedResponse::fromXContent, listener, emptySet()); + } + +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/AsyncSearchRequestConverters.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/AsyncSearchRequestConverters.java new file mode 100644 index 0000000000000..8a63589a55c51 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/AsyncSearchRequestConverters.java @@ -0,0 +1,103 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.client; + +import org.apache.http.client.methods.HttpDelete; +import org.apache.http.client.methods.HttpGet; +import org.apache.http.client.methods.HttpPost; +import org.elasticsearch.client.RequestConverters.Params; +import org.elasticsearch.client.asyncsearch.DeleteAsyncSearchRequest; +import org.elasticsearch.client.asyncsearch.GetAsyncSearchRequest; +import org.elasticsearch.client.asyncsearch.SubmitAsyncSearchRequest; +import org.elasticsearch.rest.action.search.RestSearchAction; + +import java.io.IOException; +import java.util.Locale; + +import static org.elasticsearch.client.RequestConverters.REQUEST_BODY_CONTENT_TYPE; + +final class AsyncSearchRequestConverters { + + static Request submitAsyncSearch(SubmitAsyncSearchRequest asyncSearchRequest) throws IOException { + String endpoint = new RequestConverters.EndpointBuilder().addCommaSeparatedPathParts( + asyncSearchRequest.getIndices()) + .addPathPartAsIs("_async_search").build(); + Request request = new Request(HttpPost.METHOD_NAME, endpoint); + Params params = new RequestConverters.Params(); + // add all typical search params and search request source as body + addSearchRequestParams(params, asyncSearchRequest); + if (asyncSearchRequest.getSearchSource() != null) { + request.setEntity(RequestConverters.createEntity(asyncSearchRequest.getSearchSource(), REQUEST_BODY_CONTENT_TYPE)); + } + // set async search submit specific parameters + if (asyncSearchRequest.isCleanOnCompletion() != null) { + params.putParam("clean_on_completion", asyncSearchRequest.isCleanOnCompletion().toString()); + } + if (asyncSearchRequest.getKeepAlive() != null) { + params.putParam("keep_alive", asyncSearchRequest.getKeepAlive().getStringRep()); + } + if (asyncSearchRequest.getWaitForCompletion() != null) { + 
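+            // TimeValue#getStringRep() renders the value in the "1s"/"5d" form the REST layer expects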
params.putParam("wait_for_completion", asyncSearchRequest.getWaitForCompletion().getStringRep()); + } + request.addParameters(params.asMap()); + return request; + } + + static void addSearchRequestParams(Params params, SubmitAsyncSearchRequest request) { + params.putParam(RestSearchAction.TYPED_KEYS_PARAM, "true"); + params.withRouting(request.getRouting()); + params.withPreference(request.getPreference()); + params.withIndicesOptions(request.getIndicesOptions()); + params.withSearchType(request.getSearchType().name().toLowerCase(Locale.ROOT)); + params.withMaxConcurrentShardRequests(request.getMaxConcurrentShardRequests()); + if (request.getRequestCache() != null) { + params.withRequestCache(request.getRequestCache()); + } + if (request.getAllowPartialSearchResults() != null) { + params.withAllowPartialResults(request.getAllowPartialSearchResults()); + } + params.withBatchedReduceSize(request.getBatchedReduceSize()); + } + + static Request getAsyncSearch(GetAsyncSearchRequest asyncSearchRequest) throws IOException { + String endpoint = new RequestConverters.EndpointBuilder() + .addPathPartAsIs("_async_search") + .addPathPart(asyncSearchRequest.getId()) + .build(); + Request request = new Request(HttpGet.METHOD_NAME, endpoint); + Params params = new RequestConverters.Params(); + if (asyncSearchRequest.getKeepAlive() != null) { + params.putParam("keep_alive", asyncSearchRequest.getKeepAlive().getStringRep()); + } + if (asyncSearchRequest.getWaitForCompletion() != null) { + params.putParam("wait_for_completion", asyncSearchRequest.getWaitForCompletion().getStringRep()); + } + request.addParameters(params.asMap()); + return request; + } + + static Request deleteAsyncSearch(DeleteAsyncSearchRequest deleteAsyncSearchRequest) throws IOException { + String endpoint = new RequestConverters.EndpointBuilder() + .addPathPartAsIs("_async_search") + .addPathPart(deleteAsyncSearchRequest.getId()) + .build(); + return new Request(HttpDelete.METHOD_NAME, endpoint); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/RequestConverters.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/RequestConverters.java index b9b3c4b31a414..d0cd3ea5a0091 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/RequestConverters.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/RequestConverters.java @@ -403,20 +403,24 @@ static Request search(SearchRequest searchRequest, String searchEndpoint) throws return request; } - private static void addSearchRequestParams(Params params, SearchRequest searchRequest) { + static void addSearchRequestParams(Params params, SearchRequest searchRequest) { params.putParam(RestSearchAction.TYPED_KEYS_PARAM, "true"); params.withRouting(searchRequest.routing()); params.withPreference(searchRequest.preference()); params.withIndicesOptions(searchRequest.indicesOptions()); - params.putParam("search_type", searchRequest.searchType().name().toLowerCase(Locale.ROOT)); + params.withSearchType(searchRequest.searchType().name().toLowerCase(Locale.ROOT)); params.putParam("ccs_minimize_roundtrips", Boolean.toString(searchRequest.isCcsMinimizeRoundtrips())); + if (searchRequest.getPreFilterShardSize() != null) { + params.putParam("pre_filter_shard_size", Integer.toString(searchRequest.getPreFilterShardSize())); + } + params.withMaxConcurrentShardRequests(searchRequest.getMaxConcurrentShardRequests()); if (searchRequest.requestCache() != null) { - params.putParam("request_cache", 
Boolean.toString(searchRequest.requestCache())); + params.withRequestCache(searchRequest.requestCache()); } if (searchRequest.allowPartialSearchResults() != null) { - params.putParam("allow_partial_search_results", Boolean.toString(searchRequest.allowPartialSearchResults())); + params.withAllowPartialResults(searchRequest.allowPartialSearchResults()); } - params.putParam("batched_reduce_size", Integer.toString(searchRequest.getBatchedReduceSize())); + params.withBatchedReduceSize(searchRequest.getBatchedReduceSize()); if (searchRequest.scroll() != null) { params.putParam("scroll", searchRequest.scroll().keepAlive()); } @@ -858,6 +862,26 @@ Params withPreference(String preference) { return putParam("preference", preference); } + Params withSearchType(String searchType) { + return putParam("search_type", searchType); + } + + Params withMaxConcurrentShardRequests(int maxConcurrentShardRequests) { + return putParam("max_concurrent_shard_requests", Integer.toString(maxConcurrentShardRequests)); + } + + Params withBatchedReduceSize(int batchedReduceSize) { + return putParam("batched_reduce_size", Integer.toString(batchedReduceSize)); + } + + Params withRequestCache(boolean requestCache) { + return putParam("request_cache", Boolean.toString(requestCache)); + } + + Params withAllowPartialResults(boolean allowPartialSearchResults) { + return putParam("allow_partial_search_results", Boolean.toString(allowPartialSearchResults)); + } + Params withRealtime(boolean realtime) { if (realtime == false) { return putParam("realtime", Boolean.FALSE.toString()); diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java index b464c2166f865..8b5262d6aada6 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java @@ -265,6 +265,7 @@ public class RestHighLevelClient implements Closeable { private final TransformClient transformClient = new TransformClient(this); private final EnrichClient enrichClient = new EnrichClient(this); private final EqlClient eqlClient = new EqlClient(this); + private final AsyncSearchClient asyncSearchClient = new AsyncSearchClient(this); /** * Creates a {@link RestHighLevelClient} given the low level {@link RestClientBuilder} that allows to build the @@ -428,13 +429,23 @@ public final XPackClient xpack() { * A wrapper for the {@link RestHighLevelClient} that provides methods for * accessing the Elastic Index Lifecycle APIs. *

- * See the X-Pack APIs + * See the X-Pack APIs * on elastic.co for more information. */ public IndexLifecycleClient indexLifecycle() { return ilmClient; } + /** + * A wrapper for the {@link RestHighLevelClient} that provides methods for accessing the Elastic Index Async Search APIs. + *

+ * See the X-Pack APIs on elastic.co + * for more information. + */ + public AsyncSearchClient asyncSearch() { + return asyncSearchClient; + } + /** * Provides methods for accessing the Elastic Licensed Migration APIs that * are shipped with the default distribution of Elasticsearch. All of @@ -1888,12 +1899,7 @@ protected static boolean convertExistsResponse(Response response) { * emitted there just mean that you are talking to an old version of * Elasticsearch. There isn't anything you can do about the deprecation. */ - private static final DeprecationHandler DEPRECATION_HANDLER = new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String modernName) {} - @Override - public void usedDeprecatedField(String usedName, String replacedWith) {} - }; + private static final DeprecationHandler DEPRECATION_HANDLER = DeprecationHandler.IGNORE_DEPRECATIONS; static List getDefaultNamedXContents() { Map> map = new HashMap<>(); diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/AsyncSearchResponse.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/AsyncSearchResponse.java new file mode 100644 index 0000000000000..07d3ce81fea8c --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/AsyncSearchResponse.java @@ -0,0 +1,200 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.asyncsearch; + +import org.elasticsearch.ElasticsearchException; +import org.elasticsearch.action.search.SearchResponse; +import org.elasticsearch.common.Nullable; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.Strings; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.common.xcontent.XContentParser.Token; + +import java.io.IOException; + +import static org.elasticsearch.common.xcontent.ConstructingObjectParser.constructorArg; +import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg; +import static org.elasticsearch.common.xcontent.XContentParserUtils.ensureExpectedToken; + +/** + * A response of an async search request. 
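+ * It may hold a partial {@link SearchResponse} while the search is still running, together with flags describing its progress.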
+ */
+public class AsyncSearchResponse implements ToXContentObject {
+    @Nullable
+    private final String id;
+    @Nullable
+    private final SearchResponse searchResponse;
+    @Nullable
+    private final ElasticsearchException error;
+    private final boolean isRunning;
+    private final boolean isPartial;
+
+    private final long startTimeMillis;
+    private final long expirationTimeMillis;
+
+    /**
+     * Creates an {@link AsyncSearchResponse} with the arguments that are always present in the server response
+     */
+    AsyncSearchResponse(boolean isPartial,
+                        boolean isRunning,
+                        long startTimeMillis,
+                        long expirationTimeMillis,
+                        @Nullable String id,
+                        @Nullable SearchResponse searchResponse,
+                        @Nullable ElasticsearchException error) {
+        this.isPartial = isPartial;
+        this.isRunning = isRunning;
+        this.startTimeMillis = startTimeMillis;
+        this.expirationTimeMillis = expirationTimeMillis;
+        this.id = id;
+        this.searchResponse = searchResponse;
+        this.error = error;
+    }
+
+    /**
+     * Returns the id of the async search request, or null if the response is not stored in the cluster.
+     */
+    @Nullable
+    public String getId() {
+        return id;
+    }
+
+    /**
+     * Returns the current {@link SearchResponse}, or null if not available.
+     *
+     * See {@link #isPartial()} to determine whether the response contains partial or complete
+     * results.
+     */
+    public SearchResponse getSearchResponse() {
+        return searchResponse;
+    }
+
+    /**
+     * Returns the failure reason, or null if the query is running or has completed normally.
+     */
+    public ElasticsearchException getFailure() {
+        return error;
+    }
+
+    /**
+     * Returns true if the {@link SearchResponse} contains partial
+     * results computed from a subset of the total shards.
+     */
+    public boolean isPartial() {
+        return isPartial;
+    }
+
+    /**
+     * Whether the search is still running in the cluster.
+     *
+     * A value of false indicates that the response is final
+     * even if {@link #isPartial()} returns true. In that case,
+     * the partial response represents the status of the search before a
+     * non-recoverable failure.
+     */
+    public boolean isRunning() {
+        return isRunning;
+    }
+
+    /**
+     * The time at which this response was created, as a timestamp in milliseconds since the epoch.
+     */
+    public long getStartTime() {
+        return startTimeMillis;
+    }
+
+    /**
+     * The time at which this response will expire, as a timestamp in milliseconds since the epoch.
+ */ + public long getExpirationTime() { + return expirationTimeMillis; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (id != null) { + builder.field("id", id); + } + builder.field("is_partial", isPartial); + builder.field("is_running", isRunning); + builder.field("start_time_in_millis", startTimeMillis); + builder.field("expiration_time_in_millis", expirationTimeMillis); + + if (searchResponse != null) { + builder.field("response"); + searchResponse.toXContent(builder, params); + } + if (error != null) { + builder.startObject("error"); + error.toXContent(builder, params); + builder.endObject(); + } + builder.endObject(); + return builder; + } + + public static final ParseField ID_FIELD = new ParseField("id"); + public static final ParseField IS_PARTIAL_FIELD = new ParseField("is_partial"); + public static final ParseField IS_RUNNING_FIELD = new ParseField("is_running"); + public static final ParseField START_TIME_FIELD = new ParseField("start_time_in_millis"); + public static final ParseField EXPIRATION_FIELD = new ParseField("expiration_time_in_millis"); + public static final ParseField RESPONSE_FIELD = new ParseField("response"); + public static final ParseField ERROR_FIELD = new ParseField("error"); + + public static final ConstructingObjectParser PARSER = new ConstructingObjectParser<>( + "submit_async_search_response", true, + args -> new AsyncSearchResponse( + (boolean) args[0], + (boolean) args[1], + (long) args[2], + (long) args[3], + (String) args[4], + (SearchResponse) args[5], + (ElasticsearchException) args[6])); + static { + PARSER.declareBoolean(constructorArg(), IS_PARTIAL_FIELD); + PARSER.declareBoolean(constructorArg(), IS_RUNNING_FIELD); + PARSER.declareLong(constructorArg(), START_TIME_FIELD); + PARSER.declareLong(constructorArg(), EXPIRATION_FIELD); + PARSER.declareString(optionalConstructorArg(), ID_FIELD); + PARSER.declareObject(optionalConstructorArg(), (p, c) -> AsyncSearchResponse.parseSearchResponse(p), + RESPONSE_FIELD); + PARSER.declareObject(optionalConstructorArg(), (p, c) -> ElasticsearchException.fromXContent(p), ERROR_FIELD); + } + + private static SearchResponse parseSearchResponse(XContentParser p) throws IOException { + // we should be before the opening START_OBJECT of the response + ensureExpectedToken(Token.START_OBJECT, p.currentToken(), p::getTokenLocation); + p.nextToken(); + return SearchResponse.innerFromXContent(p); + } + + public static AsyncSearchResponse fromXContent(XContentParser parser) { + return PARSER.apply(parser, null); + } + + @Override + public String toString() { + return Strings.toString(this); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/DeleteAsyncSearchRequest.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/DeleteAsyncSearchRequest.java new file mode 100644 index 0000000000000..3b37293212da0 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/DeleteAsyncSearchRequest.java @@ -0,0 +1,55 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.elasticsearch.client.asyncsearch;
+
+import org.elasticsearch.client.Validatable;
+
+import java.util.Objects;
+
+public class DeleteAsyncSearchRequest implements Validatable {
+
+    private final String id;
+
+    public DeleteAsyncSearchRequest(String id) {
+        this.id = id;
+    }
+
+    public String getId() {
+        return this.id;
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) {
+            return true;
+        }
+        if (o == null || getClass() != o.getClass()) {
+            return false;
+        }
+        DeleteAsyncSearchRequest request = (DeleteAsyncSearchRequest) o;
+        return Objects.equals(getId(), request.getId());
+    }
+
+    @Override
+    public int hashCode() {
+        return Objects.hash(getId());
+    }
+}
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequest.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequest.java
new file mode 100644
index 0000000000000..11ad059349481
--- /dev/null
+++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequest.java
@@ -0,0 +1,93 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequest.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequest.java
new file mode 100644
index 0000000000000..11ad059349481
--- /dev/null
+++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequest.java
@@ -0,0 +1,93 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+
+package org.elasticsearch.client.asyncsearch;
+
+import org.elasticsearch.client.Validatable;
+import org.elasticsearch.client.ValidationException;
+import org.elasticsearch.common.unit.TimeValue;
+
+import java.util.Objects;
+import java.util.Optional;
+
+public class GetAsyncSearchRequest implements Validatable {
+
+    private TimeValue waitForCompletion;
+    private TimeValue keepAlive;
+
+    public static final long MIN_KEEPALIVE = TimeValue.timeValueMinutes(1).millis();
+
+    private final String id;
+
+    public GetAsyncSearchRequest(String id) {
+        this.id = id;
+    }
+
+    public String getId() {
+        return this.id;
+    }
+
+    public TimeValue getWaitForCompletion() {
+        return waitForCompletion;
+    }
+
+    public void setWaitForCompletion(TimeValue waitForCompletion) {
+        this.waitForCompletion = waitForCompletion;
+    }
+
+    public TimeValue getKeepAlive() {
+        return keepAlive;
+    }
+
+    public void setKeepAlive(TimeValue keepAlive) {
+        this.keepAlive = keepAlive;
+    }
+
+    @Override
+    public Optional<ValidationException> validate() {
+        final ValidationException validationException = new ValidationException();
+        if (keepAlive != null && keepAlive.getMillis() < MIN_KEEPALIVE) {
+            validationException.addValidationError("keep_alive must be greater than 1 minute, got: " + keepAlive.toString());
+        }
+        if (validationException.validationErrors().isEmpty()) {
+            return Optional.empty();
+        }
+        return Optional.of(validationException);
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) {
+            return true;
+        }
+        if (o == null || getClass() != o.getClass()) {
+            return false;
+        }
+        GetAsyncSearchRequest request = (GetAsyncSearchRequest) o;
+        return Objects.equals(getId(), request.getId())
+            && Objects.equals(getKeepAlive(), request.getKeepAlive())
+            && Objects.equals(getWaitForCompletion(), request.getWaitForCompletion());
+    }
+
+    @Override
+    public int hashCode() {
+        return Objects.hash(getId(), getKeepAlive(), getWaitForCompletion());
+    }
+}
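A short usage sketch showing the two optional knobs and the keep-alive floor enforced by validate() (the id and the values are illustrative):

    GetAsyncSearchRequest request = new GetAsyncSearchRequest("FmRldE8z..."); // illustrative id
    request.setWaitForCompletion(TimeValue.timeValueSeconds(2)); // block up to 2s for the next update
    request.setKeepAlive(TimeValue.timeValueMinutes(10));        // anything under 1 minute fails validate()
    assert request.validate().isPresent() == false;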
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/SubmitAsyncSearchRequest.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/SubmitAsyncSearchRequest.java
new file mode 100644
index 0000000000000..1b0a07c4dea41
--- /dev/null
+++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/asyncsearch/SubmitAsyncSearchRequest.java
@@ -0,0 +1,284 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+
+package org.elasticsearch.client.asyncsearch;
+
+import org.elasticsearch.action.search.SearchRequest;
+import org.elasticsearch.action.search.SearchType;
+import org.elasticsearch.action.support.IndicesOptions;
+import org.elasticsearch.client.Validatable;
+import org.elasticsearch.client.ValidationException;
+import org.elasticsearch.common.unit.TimeValue;
+import org.elasticsearch.search.builder.SearchSourceBuilder;
+
+import java.util.Objects;
+import java.util.Optional;
+
+/**
+ * A request to asynchronously track the progress of a search against one or more indices.
+ */
+public class SubmitAsyncSearchRequest implements Validatable {
+
+    public static final int DEFAULT_PRE_FILTER_SHARD_SIZE = 1;
+    public static final int DEFAULT_BATCHED_REDUCE_SIZE = 5;
+    private static final boolean DEFAULT_CCS_MINIMIZE_ROUNDTRIPS = false;
+    private static final boolean DEFAULT_REQUEST_CACHE_VALUE = true;
+
+    public static long MIN_KEEP_ALIVE = TimeValue.timeValueMinutes(1).millis();
+
+    private TimeValue waitForCompletion;
+    private Boolean cleanOnCompletion;
+    private TimeValue keepAlive;
+    private final SearchRequest searchRequest;
+
+    /**
+     * Creates a new request
+     */
+    public SubmitAsyncSearchRequest(SearchSourceBuilder source, String... indices) {
+        this.searchRequest = new SearchRequest(indices, source);
+        searchRequest.setCcsMinimizeRoundtrips(DEFAULT_CCS_MINIMIZE_ROUNDTRIPS);
+        searchRequest.setPreFilterShardSize(DEFAULT_PRE_FILTER_SHARD_SIZE);
+        searchRequest.setBatchedReduceSize(DEFAULT_BATCHED_REDUCE_SIZE);
+        searchRequest.requestCache(DEFAULT_REQUEST_CACHE_VALUE);
+    }
+
+    /**
+     * Get the target indices
+     */
+    public String[] getIndices() {
+        return this.searchRequest.indices();
+    }
+
+
+    /**
+     * Get the minimum time that the request should wait before returning a partial result (defaults to 1 second).
+     */
+    public TimeValue getWaitForCompletion() {
+        return waitForCompletion;
+    }
+
+    /**
+     * Sets the minimum time that the request should wait before returning a partial result (defaults to 1 second).
+     */
+    public void setWaitForCompletion(TimeValue waitForCompletion) {
+        this.waitForCompletion = waitForCompletion;
+    }
+
+    /**
+     * Returns whether the resource should be removed on completion or failure (defaults to true).
+     */
+    public Boolean isCleanOnCompletion() {
+        return cleanOnCompletion;
+    }
+
+    /**
+     * Sets whether the resource should be removed on completion or failure (defaults to true).
+     */
+    public void setCleanOnCompletion(boolean cleanOnCompletion) {
+        this.cleanOnCompletion = cleanOnCompletion;
+    }
+
+    /**
+     * Get the amount of time after which the result will expire (defaults to 5 days).
+     */
+    public TimeValue getKeepAlive() {
+        return keepAlive;
+    }
+
+    /**
+     * Sets the amount of time after which the result will expire (defaults to 5 days).
+     */
+    public void setKeepAlive(TimeValue keepAlive) {
+        this.keepAlive = keepAlive;
+    }
+
+    // setters for request parameters of the wrapped SearchRequest
+    /**
+     * Set a comma separated list of routing values to control the shards that the search will be executed on.
+     */
+    public void setRouting(String routing) {
+        this.searchRequest.routing(routing);
+    }
+
+    /**
+     * Set the routing values to control the shards that the search will be executed on.
+     */
+    public void setRoutings(String... routings) {
+        this.searchRequest.routing(routings);
+    }
+    /**
+     * Get the routing value to control the shards that the search will be executed on.
+     */
+    public String getRouting() {
+        return this.searchRequest.routing();
+    }
+
+    /**
+     * Sets the preference to execute the search. Defaults to randomize across shards. Can be set to
+     * {@code _local} to prefer local shards or a custom value, which guarantees that the same order
+     * will be used across different requests.
+     */
+    public void setPreference(String preference) {
+        this.searchRequest.preference(preference);
+    }
+
+    /**
+     * Get the preference to execute the search.
+     */
+    public String getPreference() {
+        return this.searchRequest.preference();
+    }
+
+    /**
+     * Specifies what type of requested indices to ignore and how to deal with indices wildcard expressions.
+     */
+    public void setIndicesOptions(IndicesOptions indicesOptions) {
+        this.searchRequest.indicesOptions(indicesOptions);
+    }
+
+    /**
+     * Get the indices options.
+     */
+    public IndicesOptions getIndicesOptions() {
+        return this.searchRequest.indicesOptions();
+    }
+
+    /**
+     * The search type to execute, defaults to {@link SearchType#DEFAULT}.
+     */
+    public void setSearchType(SearchType searchType) {
+        this.searchRequest.searchType(searchType);
+    }
+
+    /**
+     * Get the search type to execute, defaults to {@link SearchType#DEFAULT}.
+     */
+    public SearchType getSearchType() {
+        return this.searchRequest.searchType();
+    }
+
+    /**
+     * Sets whether this request should allow partial results. (If this method is not called,
+     * the request will default to the cluster level setting).
+     */
+    public void setAllowPartialSearchResults(boolean allowPartialSearchResults) {
+        this.searchRequest.allowPartialSearchResults(allowPartialSearchResults);
+    }
+
+    /**
+     * Gets whether this request should allow partial results.
+     */
+    public Boolean getAllowPartialSearchResults() {
+        return this.searchRequest.allowPartialSearchResults();
+    }
+
+    /**
+     * Sets the number of shard results that should be reduced at once on the coordinating node. This value should be used as a protection
+     * mechanism to reduce the memory overhead per search request if the potential number of shards in the request can be large.
+     */
+    public void setBatchedReduceSize(int batchedReduceSize) {
+        this.searchRequest.setBatchedReduceSize(batchedReduceSize);
+    }
+
+    /**
+     * Gets the number of shard results that should be reduced at once on the coordinating node.
+     * This defaults to 5 for {@link SubmitAsyncSearchRequest}.
+     */
+    public int getBatchedReduceSize() {
+        return this.searchRequest.getBatchedReduceSize();
+    }
+
+    /**
+     * Sets whether this request should use the request cache, assuming that it can (for
+     * example, a query that uses "now" will never be cached). If not set (or set to null),
+     * the request falls back to the index level setting that controls whether the request cache is enabled.
+     */
+    public void setRequestCache(Boolean requestCache) {
+        this.searchRequest.requestCache(requestCache);
+    }
+
+    /**
+     * Gets whether this request should use the request cache.
+     * Defaults to {@code true} for {@link SubmitAsyncSearchRequest}.
+     */
+    public Boolean getRequestCache() {
+        return this.searchRequest.requestCache();
+    }
+
+    /**
+     * Returns the number of shard requests that should be executed concurrently on a single node.
+     * The default is {@code 5}.
+     */
+    public int getMaxConcurrentShardRequests() {
+        return this.searchRequest.getMaxConcurrentShardRequests();
+    }
+
+    /**
+     * Sets the number of shard requests that should be executed concurrently on a single node.
+     * The default is {@code 5}.
+     */
+    public void setMaxConcurrentShardRequests(int maxConcurrentShardRequests) {
+        this.searchRequest.setMaxConcurrentShardRequests(maxConcurrentShardRequests);
+    }
+
+    /**
+     * Gets the source of the search, i.e. the {@link SearchSourceBuilder} initially used on this request.
+     */
+    public SearchSourceBuilder getSearchSource() {
+        return this.searchRequest.source();
+    }
+
+    @Override
+    public Optional<ValidationException> validate() {
+        final ValidationException validationException = new ValidationException();
+        if (searchRequest.isSuggestOnly()) {
+            validationException.addValidationError("suggest-only queries are not supported");
+        }
+        if (keepAlive != null && keepAlive.getMillis() < MIN_KEEP_ALIVE) {
+            validationException.addValidationError("[keep_alive] must be greater than 1 minute, got: " + keepAlive.toString());
+        }
+        if (validationException.validationErrors().isEmpty()) {
+            return Optional.empty();
+        }
+        return Optional.of(validationException);
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) {
+            return true;
+        }
+        if (o == null || getClass() != o.getClass()) {
+            return false;
+        }
+        SubmitAsyncSearchRequest request = (SubmitAsyncSearchRequest) o;
+        return Objects.equals(searchRequest, request.searchRequest)
+            && Objects.equals(getKeepAlive(), request.getKeepAlive())
+            && Objects.equals(getWaitForCompletion(), request.getWaitForCompletion())
+            && Objects.equals(isCleanOnCompletion(), request.isCleanOnCompletion());
+    }
+
+    @Override
+    public int hashCode() {
+        return Objects.hash(searchRequest, getKeepAlive(), getWaitForCompletion(), isCleanOnCompletion());
+    }
+}
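A usage sketch for the class above (the index name and the query are illustrative; executing the request through the high-level client is not part of this diff):

    import org.elasticsearch.common.unit.TimeValue;
    import org.elasticsearch.index.query.QueryBuilders;
    import org.elasticsearch.search.builder.SearchSourceBuilder;

    SubmitAsyncSearchRequest request = new SubmitAsyncSearchRequest(
            new SearchSourceBuilder().query(QueryBuilders.matchAllQuery()), "my-index");
    request.setWaitForCompletion(TimeValue.timeValueSeconds(1)); // return (possibly partial) results after 1s
    request.setKeepAlive(TimeValue.timeValueMinutes(30));        // must be at least 1 minute
    request.setCleanOnCompletion(false); // keep the result around so it can be fetched by id later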
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/core/IndexerJobStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/core/IndexerJobStats.java
index 5e59b4b19dbbe..dc332fa8a4ab0 100644
--- a/client/rest-high-level/src/main/java/org/elasticsearch/client/core/IndexerJobStats.java
+++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/core/IndexerJobStats.java
@@ -31,8 +31,10 @@ public abstract class IndexerJobStats {
     public static ParseField NUM_INVOCATIONS = new ParseField("trigger_count");
     public static ParseField INDEX_TIME_IN_MS = new ParseField("index_time_in_ms");
     public static ParseField SEARCH_TIME_IN_MS = new ParseField("search_time_in_ms");
+    public static ParseField PROCESSING_TIME_IN_MS = new ParseField("processing_time_in_ms");
     public static ParseField INDEX_TOTAL = new ParseField("index_total");
     public static ParseField SEARCH_TOTAL = new ParseField("search_total");
+    public static ParseField PROCESSING_TOTAL = new ParseField("processing_total");
     public static ParseField SEARCH_FAILURES = new ParseField("search_failures");
     public static ParseField INDEX_FAILURES = new ParseField("index_failures");
 
@@ -44,11 +46,14 @@ public abstract class IndexerJobStats {
     protected final long indexTotal;
     protected final long searchTime;
     protected final long searchTotal;
+    protected final long processingTime;
+    protected final long processingTotal;
     protected final long indexFailures;
     protected final long searchFailures;
 
     public IndexerJobStats(long numPages, long numInputDocuments, long numOutputDocuments, long numInvocations,
-                           long indexTime, long searchTime, long indexTotal, long searchTotal, long indexFailures, long searchFailures) {
+                           long indexTime, long searchTime, long processingTime, long indexTotal, long searchTotal, long processingTotal,
+                           long indexFailures, long searchFailures) {
         this.numPages = numPages;
         this.numInputDocuments = numInputDocuments;
this.numOuputDocuments = numOutputDocuments; @@ -57,6 +62,8 @@ public IndexerJobStats(long numPages, long numInputDocuments, long numOutputDocu this.indexTotal = indexTotal; this.searchTime = searchTime; this.searchTotal = searchTotal; + this.processingTime = processingTime; + this.processingTotal = processingTotal; this.indexFailures = indexFailures; this.searchFailures = searchFailures; } @@ -117,6 +124,13 @@ public long getSearchTime() { return searchTime; } + /** + * Returns the time spent processing (cumulative) in milliseconds + */ + public long getProcessingTime() { + return processingTime; + } + /** * Returns the total number of indexing requests that have been processed * (Note: this is not the number of _documents_ that have been indexed) @@ -132,6 +146,14 @@ public long getSearchTotal() { return searchTotal; } + /** + * Returns the total number of processing runs that have been made + */ + public long getProcessingTotal() { + return processingTotal; + } + + @Override public boolean equals(Object other) { if (this == other) { @@ -149,16 +171,19 @@ public boolean equals(Object other) { && Objects.equals(this.numInvocations, that.numInvocations) && Objects.equals(this.indexTime, that.indexTime) && Objects.equals(this.searchTime, that.searchTime) + && Objects.equals(this.processingTime, that.processingTime) && Objects.equals(this.indexFailures, that.indexFailures) && Objects.equals(this.searchFailures, that.searchFailures) && Objects.equals(this.searchTotal, that.searchTotal) + && Objects.equals(this.processingTotal, that.processingTotal) && Objects.equals(this.indexTotal, that.indexTotal); } @Override public int hashCode() { return Objects.hash(numPages, numInputDocuments, numOuputDocuments, numInvocations, - indexTime, searchTime, indexFailures, searchFailures, searchTotal, indexTotal); + indexTime, searchTime, processingTime, indexFailures, searchFailures, searchTotal, + indexTotal, processingTotal); } @Override @@ -172,6 +197,8 @@ public final String toString() { + ", index_time_in_ms=" + indexTime + ", index_total=" + indexTotal + ", search_time_in_ms=" + searchTime - + ", search_total=" + searchTotal+ "}"; + + ", search_total=" + searchTotal + + ", processing_time_in_ms=" + processingTime + + ", processing_total=" + processingTotal + "}"; } } diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStats.java index 53e3adf2b8433..acdb9cccca1eb 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStats.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStats.java @@ -20,12 +20,16 @@ package org.elasticsearch.client.ml.dataframe; import org.elasticsearch.client.ml.NodeAttributes; +import org.elasticsearch.client.ml.dataframe.stats.AnalysisStats; +import org.elasticsearch.client.ml.dataframe.stats.common.DataCounts; +import org.elasticsearch.client.ml.dataframe.stats.common.MemoryUsage; import org.elasticsearch.common.Nullable; import org.elasticsearch.common.ParseField; import org.elasticsearch.common.inject.internal.ToStringBuilder; import org.elasticsearch.common.xcontent.ConstructingObjectParser; import org.elasticsearch.common.xcontent.ObjectParser; import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.common.xcontent.XContentParserUtils; import java.io.IOException; import java.util.List; 
@@ -44,7 +48,9 @@ public static DataFrameAnalyticsStats fromXContent(XContentParser parser) throws static final ParseField STATE = new ParseField("state"); static final ParseField FAILURE_REASON = new ParseField("failure_reason"); static final ParseField PROGRESS = new ParseField("progress"); + static final ParseField DATA_COUNTS = new ParseField("data_counts"); static final ParseField MEMORY_USAGE = new ParseField("memory_usage"); + static final ParseField ANALYSIS_STATS = new ParseField("analysis_stats"); static final ParseField NODE = new ParseField("node"); static final ParseField ASSIGNMENT_EXPLANATION = new ParseField("assignment_explanation"); @@ -56,9 +62,11 @@ public static DataFrameAnalyticsStats fromXContent(XContentParser parser) throws (DataFrameAnalyticsState) args[1], (String) args[2], (List) args[3], - (MemoryUsage) args[4], - (NodeAttributes) args[5], - (String) args[6])); + (DataCounts) args[4], + (MemoryUsage) args[5], + (AnalysisStats) args[6], + (NodeAttributes) args[7], + (String) args[8])); static { PARSER.declareString(constructorArg(), ID); @@ -70,27 +78,42 @@ public static DataFrameAnalyticsStats fromXContent(XContentParser parser) throws }, STATE, ObjectParser.ValueType.STRING); PARSER.declareString(optionalConstructorArg(), FAILURE_REASON); PARSER.declareObjectArray(optionalConstructorArg(), PhaseProgress.PARSER, PROGRESS); + PARSER.declareObject(optionalConstructorArg(), DataCounts.PARSER, DATA_COUNTS); PARSER.declareObject(optionalConstructorArg(), MemoryUsage.PARSER, MEMORY_USAGE); + PARSER.declareObject(optionalConstructorArg(), (p, c) -> parseAnalysisStats(p), ANALYSIS_STATS); PARSER.declareObject(optionalConstructorArg(), NodeAttributes.PARSER, NODE); PARSER.declareString(optionalConstructorArg(), ASSIGNMENT_EXPLANATION); } + private static AnalysisStats parseAnalysisStats(XContentParser parser) throws IOException { + XContentParserUtils.ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.currentToken(), parser::getTokenLocation); + XContentParserUtils.ensureExpectedToken(XContentParser.Token.FIELD_NAME, parser.nextToken(), parser::getTokenLocation); + AnalysisStats analysisStats = parser.namedObject(AnalysisStats.class, parser.currentName(), true); + XContentParserUtils.ensureExpectedToken(XContentParser.Token.END_OBJECT, parser.nextToken(), parser::getTokenLocation); + return analysisStats; + } + private final String id; private final DataFrameAnalyticsState state; private final String failureReason; private final List progress; + private final DataCounts dataCounts; private final MemoryUsage memoryUsage; + private final AnalysisStats analysisStats; private final NodeAttributes node; private final String assignmentExplanation; public DataFrameAnalyticsStats(String id, DataFrameAnalyticsState state, @Nullable String failureReason, - @Nullable List progress, @Nullable MemoryUsage memoryUsage, - @Nullable NodeAttributes node, @Nullable String assignmentExplanation) { + @Nullable List progress, @Nullable DataCounts dataCounts, + @Nullable MemoryUsage memoryUsage, @Nullable AnalysisStats analysisStats, @Nullable NodeAttributes node, + @Nullable String assignmentExplanation) { this.id = id; this.state = state; this.failureReason = failureReason; this.progress = progress; + this.dataCounts = dataCounts; this.memoryUsage = memoryUsage; + this.analysisStats = analysisStats; this.node = node; this.assignmentExplanation = assignmentExplanation; } @@ -111,11 +134,21 @@ public List getProgress() { return progress; } + @Nullable + public DataCounts 
getDataCounts() { + return dataCounts; + } + @Nullable public MemoryUsage getMemoryUsage() { return memoryUsage; } + @Nullable + public AnalysisStats getAnalysisStats() { + return analysisStats; + } + public NodeAttributes getNode() { return node; } @@ -134,14 +167,16 @@ public boolean equals(Object o) { && Objects.equals(state, other.state) && Objects.equals(failureReason, other.failureReason) && Objects.equals(progress, other.progress) + && Objects.equals(dataCounts, other.dataCounts) && Objects.equals(memoryUsage, other.memoryUsage) + && Objects.equals(analysisStats, other.analysisStats) && Objects.equals(node, other.node) && Objects.equals(assignmentExplanation, other.assignmentExplanation); } @Override public int hashCode() { - return Objects.hash(id, state, failureReason, progress, memoryUsage, node, assignmentExplanation); + return Objects.hash(id, state, failureReason, progress, dataCounts, memoryUsage, analysisStats, node, assignmentExplanation); } @Override @@ -151,7 +186,9 @@ public String toString() { .add("state", state) .add("failureReason", failureReason) .add("progress", progress) + .add("dataCounts", dataCounts) .add("memoryUsage", memoryUsage) + .add("analysisStats", analysisStats) .add("node", node) .add("assignmentExplanation", assignmentExplanation) .toString(); diff --git a/modules/kibana/build.gradle b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/AnalysisStats.java similarity index 67% rename from modules/kibana/build.gradle rename to client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/AnalysisStats.java index f9d11e5a6c58b..c1a823682a762 100644 --- a/modules/kibana/build.gradle +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/AnalysisStats.java @@ -7,7 +7,7 @@ * not use this file except in compliance with the License. * You may obtain a copy of the License at * - * http://www.apache.org/licenses/LICENSE-2.0 + * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, * software distributed under the License is distributed on an @@ -16,16 +16,14 @@ * specific language governing permissions and limitations * under the License. */ +package org.elasticsearch.client.ml.dataframe.stats; -esplugin { - description 'Plugin exposing APIs for Kibana system indices' - classname 'org.elasticsearch.kibana.KibanaPlugin' -} +import org.elasticsearch.common.xcontent.ToXContentObject; -dependencies { - compile project(path: ':modules:reindex', configuration: 'runtime') -} +/** + * Statistics for the data frame analysis + */ +public interface AnalysisStats extends ToXContentObject { -testClusters.integTest { - module file(project(':modules:reindex').tasks.bundlePlugin.archiveFile) + String getName(); } diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/AnalysisStatsNamedXContentProvider.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/AnalysisStatsNamedXContentProvider.java new file mode 100644 index 0000000000000..8c9bc615e8653 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/AnalysisStatsNamedXContentProvider.java @@ -0,0 +1,52 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. 
Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.elasticsearch.client.ml.dataframe.stats;
+
+import org.elasticsearch.client.ml.dataframe.stats.classification.ClassificationStats;
+import org.elasticsearch.client.ml.dataframe.stats.outlierdetection.OutlierDetectionStats;
+import org.elasticsearch.client.ml.dataframe.stats.regression.RegressionStats;
+import org.elasticsearch.common.xcontent.NamedXContentRegistry;
+import org.elasticsearch.plugins.spi.NamedXContentProvider;
+
+import java.util.Arrays;
+import java.util.List;
+
+public class AnalysisStatsNamedXContentProvider implements NamedXContentProvider {
+
+    @Override
+    public List<NamedXContentRegistry.Entry> getNamedXContentParsers() {
+        return Arrays.asList(
+            new NamedXContentRegistry.Entry(
+                AnalysisStats.class,
+                ClassificationStats.NAME,
+                (p, c) -> ClassificationStats.PARSER.apply(p, null)
+            ),
+            new NamedXContentRegistry.Entry(
+                AnalysisStats.class,
+                OutlierDetectionStats.NAME,
+                (p, c) -> OutlierDetectionStats.PARSER.apply(p, null)
+            ),
+            new NamedXContentRegistry.Entry(
+                AnalysisStats.class,
+                RegressionStats.NAME,
+                (p, c) -> RegressionStats.PARSER.apply(p, null)
+            )
+        );
+    }
+}
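For context, a minimal sketch of how these entries can be plugged into a registry so that XContentParser#namedObject can resolve the three analysis types (the variable name is illustrative):

    NamedXContentRegistry registry = new NamedXContentRegistry(
            new AnalysisStatsNamedXContentProvider().getNamedXContentParsers());
    // A parser created with this registry can resolve "classification_stats",
    // "outlier_detection_stats" and "regression_stats" to AnalysisStats instances,
    // which is what DataFrameAnalyticsStats.parseAnalysisStats relies on.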
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/ClassificationStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/ClassificationStats.java
new file mode 100644
index 0000000000000..101f74f2fe239
--- /dev/null
+++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/ClassificationStats.java
@@ -0,0 +1,135 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.elasticsearch.client.ml.dataframe.stats.classification;
+
+import org.elasticsearch.client.common.TimeUtil;
+import org.elasticsearch.client.ml.dataframe.stats.AnalysisStats;
+import org.elasticsearch.common.ParseField;
+import org.elasticsearch.common.xcontent.ConstructingObjectParser;
+import org.elasticsearch.common.xcontent.ObjectParser;
+import org.elasticsearch.common.xcontent.ToXContent;
+import org.elasticsearch.common.xcontent.XContentBuilder;
+
+import java.io.IOException;
+import java.time.Instant;
+import java.util.Objects;
+
+public class ClassificationStats implements AnalysisStats {
+
+    public static final ParseField NAME = new ParseField("classification_stats");
+
+    public static final ParseField TIMESTAMP = new ParseField("timestamp");
+    public static final ParseField ITERATION = new ParseField("iteration");
+    public static final ParseField HYPERPARAMETERS = new ParseField("hyperparameters");
+    public static final ParseField TIMING_STATS = new ParseField("timing_stats");
+    public static final ParseField VALIDATION_LOSS = new ParseField("validation_loss");
+
+    public static final ConstructingObjectParser<ClassificationStats, Void> PARSER = new ConstructingObjectParser<>(
+        NAME.getPreferredName(),
+        true,
+        a -> new ClassificationStats(
+            (Instant) a[0],
+            (Integer) a[1],
+            (Hyperparameters) a[2],
+            (TimingStats) a[3],
+            (ValidationLoss) a[4]
+        )
+    );
+
+    static {
+        PARSER.declareField(ConstructingObjectParser.constructorArg(),
+            p -> TimeUtil.parseTimeFieldToInstant(p, TIMESTAMP.getPreferredName()),
+            TIMESTAMP,
+            ObjectParser.ValueType.VALUE);
+        PARSER.declareInt(ConstructingObjectParser.optionalConstructorArg(), ITERATION);
+        PARSER.declareObject(ConstructingObjectParser.constructorArg(), Hyperparameters.PARSER, HYPERPARAMETERS);
+        PARSER.declareObject(ConstructingObjectParser.constructorArg(), TimingStats.PARSER, TIMING_STATS);
+        PARSER.declareObject(ConstructingObjectParser.constructorArg(), ValidationLoss.PARSER, VALIDATION_LOSS);
+    }
+
+    private final Instant timestamp;
+    private final Integer iteration;
+    private final Hyperparameters hyperparameters;
+    private final TimingStats timingStats;
+    private final ValidationLoss validationLoss;
+
+    public ClassificationStats(Instant timestamp, Integer iteration, Hyperparameters hyperparameters, TimingStats timingStats,
+                               ValidationLoss validationLoss) {
+        this.timestamp = Instant.ofEpochMilli(Objects.requireNonNull(timestamp).toEpochMilli());
+        this.iteration = iteration;
+        this.hyperparameters = Objects.requireNonNull(hyperparameters);
+        this.timingStats = Objects.requireNonNull(timingStats);
+        this.validationLoss = Objects.requireNonNull(validationLoss);
+    }
+
+    public Instant getTimestamp() {
+        return timestamp;
+    }
+
+    public Integer getIteration() {
+        return iteration;
+    }
+
+    public Hyperparameters getHyperparameters() {
+        return hyperparameters;
+    }
+
+    public TimingStats getTimingStats() {
+        return timingStats;
+    }
+
+    public ValidationLoss getValidationLoss() {
+        return validationLoss;
+    }
+
+    @Override
+    public XContentBuilder toXContent(XContentBuilder builder, ToXContent.Params params) throws IOException {
+        builder.startObject();
+        builder.timeField(TIMESTAMP.getPreferredName(), TIMESTAMP.getPreferredName() + "_string", timestamp.toEpochMilli());
+        if (iteration != null) {
+            builder.field(ITERATION.getPreferredName(), iteration);
+        }
+        builder.field(HYPERPARAMETERS.getPreferredName(), hyperparameters);
+        builder.field(TIMING_STATS.getPreferredName(), timingStats);
+        builder.field(VALIDATION_LOSS.getPreferredName(),
validationLoss); + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ClassificationStats that = (ClassificationStats) o; + return Objects.equals(timestamp, that.timestamp) + && Objects.equals(iteration, that.iteration) + && Objects.equals(hyperparameters, that.hyperparameters) + && Objects.equals(timingStats, that.timingStats) + && Objects.equals(validationLoss, that.validationLoss); + } + + @Override + public int hashCode() { + return Objects.hash(timestamp, iteration, hyperparameters, timingStats, validationLoss); + } + + @Override + public String getName() { + return NAME.getPreferredName(); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/Hyperparameters.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/Hyperparameters.java new file mode 100644 index 0000000000000..c8d581b1d9c41 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/Hyperparameters.java @@ -0,0 +1,293 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.elasticsearch.client.ml.dataframe.stats.classification; + +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Objects; + +import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg; + +public class Hyperparameters implements ToXContentObject { + + public static final ParseField CLASS_ASSIGNMENT_OBJECTIVE = new ParseField("class_assignment_objective"); + public static final ParseField DOWNSAMPLE_FACTOR = new ParseField("downsample_factor"); + public static final ParseField ETA = new ParseField("eta"); + public static final ParseField ETA_GROWTH_RATE_PER_TREE = new ParseField("eta_growth_rate_per_tree"); + public static final ParseField FEATURE_BAG_FRACTION = new ParseField("feature_bag_fraction"); + public static final ParseField MAX_ATTEMPTS_TO_ADD_TREE = new ParseField("max_attempts_to_add_tree"); + public static final ParseField MAX_OPTIMIZATION_ROUNDS_PER_HYPERPARAMETER = new ParseField( + "max_optimization_rounds_per_hyperparameter"); + public static final ParseField MAX_TREES = new ParseField("max_trees"); + public static final ParseField NUM_FOLDS = new ParseField("num_folds"); + public static final ParseField NUM_SPLITS_PER_FEATURE = new ParseField("num_splits_per_feature"); + public static final ParseField REGULARIZATION_DEPTH_PENALTY_MULTIPLIER = new ParseField("regularization_depth_penalty_multiplier"); + public static final ParseField REGULARIZATION_LEAF_WEIGHT_PENALTY_MULTIPLIER + = new ParseField("regularization_leaf_weight_penalty_multiplier"); + public static final ParseField REGULARIZATION_SOFT_TREE_DEPTH_LIMIT = new ParseField("regularization_soft_tree_depth_limit"); + public static final ParseField REGULARIZATION_SOFT_TREE_DEPTH_TOLERANCE = new ParseField("regularization_soft_tree_depth_tolerance"); + public static final ParseField REGULARIZATION_TREE_SIZE_PENALTY_MULTIPLIER = + new ParseField("regularization_tree_size_penalty_multiplier"); + + public static ConstructingObjectParser PARSER = new ConstructingObjectParser<>("classification_hyperparameters", + true, + a -> new Hyperparameters( + (String) a[0], + (Double) a[1], + (Double) a[2], + (Double) a[3], + (Double) a[4], + (Integer) a[5], + (Integer) a[6], + (Integer) a[7], + (Integer) a[8], + (Integer) a[9], + (Double) a[10], + (Double) a[11], + (Double) a[12], + (Double) a[13], + (Double) a[14] + )); + + static { + PARSER.declareString(optionalConstructorArg(), CLASS_ASSIGNMENT_OBJECTIVE); + PARSER.declareDouble(optionalConstructorArg(), DOWNSAMPLE_FACTOR); + PARSER.declareDouble(optionalConstructorArg(), ETA); + PARSER.declareDouble(optionalConstructorArg(), ETA_GROWTH_RATE_PER_TREE); + PARSER.declareDouble(optionalConstructorArg(), FEATURE_BAG_FRACTION); + PARSER.declareInt(optionalConstructorArg(), MAX_ATTEMPTS_TO_ADD_TREE); + PARSER.declareInt(optionalConstructorArg(), MAX_OPTIMIZATION_ROUNDS_PER_HYPERPARAMETER); + PARSER.declareInt(optionalConstructorArg(), MAX_TREES); + PARSER.declareInt(optionalConstructorArg(), NUM_FOLDS); + PARSER.declareInt(optionalConstructorArg(), NUM_SPLITS_PER_FEATURE); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_DEPTH_PENALTY_MULTIPLIER); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_LEAF_WEIGHT_PENALTY_MULTIPLIER); + PARSER.declareDouble(optionalConstructorArg(), 
REGULARIZATION_SOFT_TREE_DEPTH_LIMIT); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_SOFT_TREE_DEPTH_TOLERANCE); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_TREE_SIZE_PENALTY_MULTIPLIER); + } + + private final String classAssignmentObjective; + private final Double downsampleFactor; + private final Double eta; + private final Double etaGrowthRatePerTree; + private final Double featureBagFraction; + private final Integer maxAttemptsToAddTree; + private final Integer maxOptimizationRoundsPerHyperparameter; + private final Integer maxTrees; + private final Integer numFolds; + private final Integer numSplitsPerFeature; + private final Double regularizationDepthPenaltyMultiplier; + private final Double regularizationLeafWeightPenaltyMultiplier; + private final Double regularizationSoftTreeDepthLimit; + private final Double regularizationSoftTreeDepthTolerance; + private final Double regularizationTreeSizePenaltyMultiplier; + + public Hyperparameters(String classAssignmentObjective, + Double downsampleFactor, + Double eta, + Double etaGrowthRatePerTree, + Double featureBagFraction, + Integer maxAttemptsToAddTree, + Integer maxOptimizationRoundsPerHyperparameter, + Integer maxTrees, + Integer numFolds, + Integer numSplitsPerFeature, + Double regularizationDepthPenaltyMultiplier, + Double regularizationLeafWeightPenaltyMultiplier, + Double regularizationSoftTreeDepthLimit, + Double regularizationSoftTreeDepthTolerance, + Double regularizationTreeSizePenaltyMultiplier) { + this.classAssignmentObjective = classAssignmentObjective; + this.downsampleFactor = downsampleFactor; + this.eta = eta; + this.etaGrowthRatePerTree = etaGrowthRatePerTree; + this.featureBagFraction = featureBagFraction; + this.maxAttemptsToAddTree = maxAttemptsToAddTree; + this.maxOptimizationRoundsPerHyperparameter = maxOptimizationRoundsPerHyperparameter; + this.maxTrees = maxTrees; + this.numFolds = numFolds; + this.numSplitsPerFeature = numSplitsPerFeature; + this.regularizationDepthPenaltyMultiplier = regularizationDepthPenaltyMultiplier; + this.regularizationLeafWeightPenaltyMultiplier = regularizationLeafWeightPenaltyMultiplier; + this.regularizationSoftTreeDepthLimit = regularizationSoftTreeDepthLimit; + this.regularizationSoftTreeDepthTolerance = regularizationSoftTreeDepthTolerance; + this.regularizationTreeSizePenaltyMultiplier = regularizationTreeSizePenaltyMultiplier; + } + + public String getClassAssignmentObjective() { + return classAssignmentObjective; + } + + public Double getDownsampleFactor() { + return downsampleFactor; + } + + public Double getEta() { + return eta; + } + + public Double getEtaGrowthRatePerTree() { + return etaGrowthRatePerTree; + } + + public Double getFeatureBagFraction() { + return featureBagFraction; + } + + public Integer getMaxAttemptsToAddTree() { + return maxAttemptsToAddTree; + } + + public Integer getMaxOptimizationRoundsPerHyperparameter() { + return maxOptimizationRoundsPerHyperparameter; + } + + public Integer getMaxTrees() { + return maxTrees; + } + + public Integer getNumFolds() { + return numFolds; + } + + public Integer getNumSplitsPerFeature() { + return numSplitsPerFeature; + } + + public Double getRegularizationDepthPenaltyMultiplier() { + return regularizationDepthPenaltyMultiplier; + } + + public Double getRegularizationLeafWeightPenaltyMultiplier() { + return regularizationLeafWeightPenaltyMultiplier; + } + + public Double getRegularizationSoftTreeDepthLimit() { + return regularizationSoftTreeDepthLimit; + } + + public Double 
getRegularizationSoftTreeDepthTolerance() { + return regularizationSoftTreeDepthTolerance; + } + + public Double getRegularizationTreeSizePenaltyMultiplier() { + return regularizationTreeSizePenaltyMultiplier; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (classAssignmentObjective != null) { + builder.field(CLASS_ASSIGNMENT_OBJECTIVE.getPreferredName(), classAssignmentObjective); + } + if (downsampleFactor != null) { + builder.field(DOWNSAMPLE_FACTOR.getPreferredName(), downsampleFactor); + } + if (eta != null) { + builder.field(ETA.getPreferredName(), eta); + } + if (etaGrowthRatePerTree != null) { + builder.field(ETA_GROWTH_RATE_PER_TREE.getPreferredName(), etaGrowthRatePerTree); + } + if (featureBagFraction != null) { + builder.field(FEATURE_BAG_FRACTION.getPreferredName(), featureBagFraction); + } + if (maxAttemptsToAddTree != null) { + builder.field(MAX_ATTEMPTS_TO_ADD_TREE.getPreferredName(), maxAttemptsToAddTree); + } + if (maxOptimizationRoundsPerHyperparameter != null) { + builder.field(MAX_OPTIMIZATION_ROUNDS_PER_HYPERPARAMETER.getPreferredName(), maxOptimizationRoundsPerHyperparameter); + } + if (maxTrees != null) { + builder.field(MAX_TREES.getPreferredName(), maxTrees); + } + if (numFolds != null) { + builder.field(NUM_FOLDS.getPreferredName(), numFolds); + } + if (numSplitsPerFeature != null) { + builder.field(NUM_SPLITS_PER_FEATURE.getPreferredName(), numSplitsPerFeature); + } + if (regularizationDepthPenaltyMultiplier != null) { + builder.field(REGULARIZATION_DEPTH_PENALTY_MULTIPLIER.getPreferredName(), regularizationDepthPenaltyMultiplier); + } + if (regularizationLeafWeightPenaltyMultiplier != null) { + builder.field(REGULARIZATION_LEAF_WEIGHT_PENALTY_MULTIPLIER.getPreferredName(), regularizationLeafWeightPenaltyMultiplier); + } + if (regularizationSoftTreeDepthLimit != null) { + builder.field(REGULARIZATION_SOFT_TREE_DEPTH_LIMIT.getPreferredName(), regularizationSoftTreeDepthLimit); + } + if (regularizationSoftTreeDepthTolerance != null) { + builder.field(REGULARIZATION_SOFT_TREE_DEPTH_TOLERANCE.getPreferredName(), regularizationSoftTreeDepthTolerance); + } + if (regularizationTreeSizePenaltyMultiplier != null) { + builder.field(REGULARIZATION_TREE_SIZE_PENALTY_MULTIPLIER.getPreferredName(), regularizationTreeSizePenaltyMultiplier); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + + Hyperparameters that = (Hyperparameters) o; + return Objects.equals(classAssignmentObjective, that.classAssignmentObjective) + && Objects.equals(downsampleFactor, that.downsampleFactor) + && Objects.equals(eta, that.eta) + && Objects.equals(etaGrowthRatePerTree, that.etaGrowthRatePerTree) + && Objects.equals(featureBagFraction, that.featureBagFraction) + && Objects.equals(maxAttemptsToAddTree, that.maxAttemptsToAddTree) + && Objects.equals(maxOptimizationRoundsPerHyperparameter, that.maxOptimizationRoundsPerHyperparameter) + && Objects.equals(maxTrees, that.maxTrees) + && Objects.equals(numFolds, that.numFolds) + && Objects.equals(numSplitsPerFeature, that.numSplitsPerFeature) + && Objects.equals(regularizationDepthPenaltyMultiplier, that.regularizationDepthPenaltyMultiplier) + && Objects.equals(regularizationLeafWeightPenaltyMultiplier, that.regularizationLeafWeightPenaltyMultiplier) + && Objects.equals(regularizationSoftTreeDepthLimit, 
that.regularizationSoftTreeDepthLimit) + && Objects.equals(regularizationSoftTreeDepthTolerance, that.regularizationSoftTreeDepthTolerance) + && Objects.equals(regularizationTreeSizePenaltyMultiplier, that.regularizationTreeSizePenaltyMultiplier); + } + + @Override + public int hashCode() { + return Objects.hash( + classAssignmentObjective, + downsampleFactor, + eta, + etaGrowthRatePerTree, + featureBagFraction, + maxAttemptsToAddTree, + maxOptimizationRoundsPerHyperparameter, + maxTrees, + numFolds, + numSplitsPerFeature, + regularizationDepthPenaltyMultiplier, + regularizationLeafWeightPenaltyMultiplier, + regularizationSoftTreeDepthLimit, + regularizationSoftTreeDepthTolerance, + regularizationTreeSizePenaltyMultiplier + ); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/TimingStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/TimingStats.java new file mode 100644 index 0000000000000..bad599298a780 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/TimingStats.java @@ -0,0 +1,87 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.classification; + +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Objects; + +public class TimingStats implements ToXContentObject { + + public static final ParseField ELAPSED_TIME = new ParseField("elapsed_time"); + public static final ParseField ITERATION_TIME = new ParseField("iteration_time"); + + public static ConstructingObjectParser PARSER = new ConstructingObjectParser<>("classification_timing_stats", true, + a -> new TimingStats( + a[0] == null ? null : TimeValue.timeValueMillis((long) a[0]), + a[1] == null ? 
null : TimeValue.timeValueMillis((long) a[1]) + )); + + static { + PARSER.declareLong(ConstructingObjectParser.optionalConstructorArg(), ELAPSED_TIME); + PARSER.declareLong(ConstructingObjectParser.optionalConstructorArg(), ITERATION_TIME); + } + + private final TimeValue elapsedTime; + private final TimeValue iterationTime; + + public TimingStats(TimeValue elapsedTime, TimeValue iterationTime) { + this.elapsedTime = elapsedTime; + this.iterationTime = iterationTime; + } + + public TimeValue getElapsedTime() { + return elapsedTime; + } + + public TimeValue getIterationTime() { + return iterationTime; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (elapsedTime != null) { + builder.humanReadableField(ELAPSED_TIME.getPreferredName(), ELAPSED_TIME.getPreferredName() + "_string", elapsedTime); + } + if (iterationTime != null) { + builder.humanReadableField(ITERATION_TIME.getPreferredName(), ITERATION_TIME.getPreferredName() + "_string", iterationTime); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + TimingStats that = (TimingStats) o; + return Objects.equals(elapsedTime, that.elapsedTime) && Objects.equals(iterationTime, that.iterationTime); + } + + @Override + public int hashCode() { + return Objects.hash(elapsedTime, iterationTime); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/ValidationLoss.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/ValidationLoss.java new file mode 100644 index 0000000000000..a552f5d85e124 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/classification/ValidationLoss.java @@ -0,0 +1,87 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.elasticsearch.client.ml.dataframe.stats.classification; + +import org.elasticsearch.client.ml.dataframe.stats.common.FoldValues; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.List; +import java.util.Objects; + +public class ValidationLoss implements ToXContentObject { + + public static final ParseField LOSS_TYPE = new ParseField("loss_type"); + public static final ParseField FOLD_VALUES = new ParseField("fold_values"); + + @SuppressWarnings("unchecked") + public static ConstructingObjectParser PARSER = new ConstructingObjectParser<>("classification_validation_loss", + true, + a -> new ValidationLoss((String) a[0], (List) a[1])); + + static { + PARSER.declareString(ConstructingObjectParser.optionalConstructorArg(), LOSS_TYPE); + PARSER.declareObjectArray(ConstructingObjectParser.optionalConstructorArg(), FoldValues.PARSER, FOLD_VALUES); + } + + private final String lossType; + private final List foldValues; + + public ValidationLoss(String lossType, List values) { + this.lossType = lossType; + this.foldValues = values; + } + + public String getLossType() { + return lossType; + } + + public List getFoldValues() { + return foldValues; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (lossType != null) { + builder.field(LOSS_TYPE.getPreferredName(), lossType); + } + if (foldValues != null) { + builder.field(FOLD_VALUES.getPreferredName(), foldValues); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ValidationLoss that = (ValidationLoss) o; + return Objects.equals(lossType, that.lossType) && Objects.equals(foldValues, that.foldValues); + } + + @Override + public int hashCode() { + return Objects.hash(lossType, foldValues); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/DataCounts.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/DataCounts.java new file mode 100644 index 0000000000000..b7a90b1f0b5c6 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/DataCounts.java @@ -0,0 +1,119 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.elasticsearch.client.ml.dataframe.stats.common; + +import org.elasticsearch.common.Nullable; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.inject.internal.ToStringBuilder; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Objects; + +import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg; + +public class DataCounts implements ToXContentObject { + + public static final String TYPE_VALUE = "analytics_data_counts"; + + public static final ParseField TRAINING_DOCS_COUNT = new ParseField("training_docs_count"); + public static final ParseField TEST_DOCS_COUNT = new ParseField("test_docs_count"); + public static final ParseField SKIPPED_DOCS_COUNT = new ParseField("skipped_docs_count"); + + public static final ConstructingObjectParser PARSER = new ConstructingObjectParser<>(TYPE_VALUE, true, + a -> { + Long trainingDocsCount = (Long) a[0]; + Long testDocsCount = (Long) a[1]; + Long skippedDocsCount = (Long) a[2]; + return new DataCounts( + getOrDefault(trainingDocsCount, 0L), + getOrDefault(testDocsCount, 0L), + getOrDefault(skippedDocsCount, 0L) + ); + }); + + static { + PARSER.declareLong(optionalConstructorArg(), TRAINING_DOCS_COUNT); + PARSER.declareLong(optionalConstructorArg(), TEST_DOCS_COUNT); + PARSER.declareLong(optionalConstructorArg(), SKIPPED_DOCS_COUNT); + } + + private final long trainingDocsCount; + private final long testDocsCount; + private final long skippedDocsCount; + + public DataCounts(long trainingDocsCount, long testDocsCount, long skippedDocsCount) { + this.trainingDocsCount = trainingDocsCount; + this.testDocsCount = testDocsCount; + this.skippedDocsCount = skippedDocsCount; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + builder.field(TRAINING_DOCS_COUNT.getPreferredName(), trainingDocsCount); + builder.field(TEST_DOCS_COUNT.getPreferredName(), testDocsCount); + builder.field(SKIPPED_DOCS_COUNT.getPreferredName(), skippedDocsCount); + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + DataCounts that = (DataCounts) o; + return trainingDocsCount == that.trainingDocsCount + && testDocsCount == that.testDocsCount + && skippedDocsCount == that.skippedDocsCount; + } + + @Override + public int hashCode() { + return Objects.hash(trainingDocsCount, testDocsCount, skippedDocsCount); + } + + @Override + public String toString() { + return new ToStringBuilder(getClass()) + .add(TRAINING_DOCS_COUNT.getPreferredName(), trainingDocsCount) + .add(TEST_DOCS_COUNT.getPreferredName(), testDocsCount) + .add(SKIPPED_DOCS_COUNT.getPreferredName(), skippedDocsCount) + .toString(); + } + + public long getTrainingDocsCount() { + return trainingDocsCount; + } + + public long getTestDocsCount() { + return testDocsCount; + } + + public long getSkippedDocsCount() { + return skippedDocsCount; + } + + private static T getOrDefault(@Nullable T value, T defaultValue) { + return value != null ? 
value : defaultValue; + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/FoldValues.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/FoldValues.java new file mode 100644 index 0000000000000..30490981d9651 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/FoldValues.java @@ -0,0 +1,87 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.common; + +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Arrays; +import java.util.List; +import java.util.Objects; + +public class FoldValues implements ToXContentObject { + + public static final ParseField FOLD = new ParseField("fold"); + public static final ParseField VALUES = new ParseField("values"); + + @SuppressWarnings("unchecked") + public static ConstructingObjectParser PARSER = new ConstructingObjectParser<>("fold_values", true, + a -> new FoldValues((int) a[0], (List) a[1])); + + static { + PARSER.declareInt(ConstructingObjectParser.constructorArg(), FOLD); + PARSER.declareDoubleArray(ConstructingObjectParser.constructorArg(), VALUES); + } + + private final int fold; + private final double[] values; + + private FoldValues(int fold, List values) { + this(fold, values.stream().mapToDouble(Double::doubleValue).toArray()); + } + + public FoldValues(int fold, double[] values) { + this.fold = fold; + this.values = values; + } + + public int getFold() { + return fold; + } + + public double[] getValues() { + return values; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + builder.field(FOLD.getPreferredName(), fold); + builder.array(VALUES.getPreferredName(), values); + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (o == this) return true; + if (o == null || getClass() != o.getClass()) return false; + + FoldValues other = (FoldValues) o; + return fold == other.fold && Arrays.equals(values, other.values); + } + + @Override + public int hashCode() { + return Objects.hash(fold, Arrays.hashCode(values)); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/MemoryUsage.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/MemoryUsage.java similarity index 94% rename from 
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/MemoryUsage.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/MemoryUsage.java similarity index 94% rename from client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/MemoryUsage.java rename to client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/MemoryUsage.java index 323ebb52a7aed..f492d26528e02 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/MemoryUsage.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/common/MemoryUsage.java @@ -16,7 +16,7 @@ * specific language governing permissions and limitations * under the License. */ -package org.elasticsearch.client.ml.dataframe; +package org.elasticsearch.client.ml.dataframe.stats.common; import org.elasticsearch.client.common.TimeUtil; import org.elasticsearch.common.ParseField; @@ -54,6 +54,14 @@ public MemoryUsage(Instant timestamp, long peakUsageBytes) { this.peakUsageBytes = peakUsageBytes; } + public Instant getTimestamp() { + return timestamp; + } + + public long getPeakUsageBytes() { + return peakUsageBytes; + } + @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject(); diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/OutlierDetectionStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/OutlierDetectionStats.java new file mode 100644 index 0000000000000..e3236dad0cd26 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/OutlierDetectionStats.java @@ -0,0 +1,105 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License.
+ */ +package org.elasticsearch.client.ml.dataframe.stats.outlierdetection; + +import org.elasticsearch.client.common.TimeUtil; +import org.elasticsearch.client.ml.dataframe.stats.AnalysisStats; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ObjectParser; +import org.elasticsearch.common.xcontent.ToXContent; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.time.Instant; +import java.util.Objects; + +public class OutlierDetectionStats implements AnalysisStats { + + public static final ParseField NAME = new ParseField("outlier_detection_stats"); + + public static final ParseField TIMESTAMP = new ParseField("timestamp"); + public static final ParseField PARAMETERS = new ParseField("parameters"); + public static final ParseField TIMING_STATS = new ParseField("timing_stats"); + + public static final ConstructingObjectParser PARSER = new ConstructingObjectParser<>( + NAME.getPreferredName(), true, + a -> new OutlierDetectionStats((Instant) a[0], (Parameters) a[1], (TimingStats) a[2])); + + static { + PARSER.declareField(ConstructingObjectParser.constructorArg(), + p -> TimeUtil.parseTimeFieldToInstant(p, TIMESTAMP.getPreferredName()), + TIMESTAMP, + ObjectParser.ValueType.VALUE); + PARSER.declareObject(ConstructingObjectParser.constructorArg(), Parameters.PARSER, PARAMETERS); + PARSER.declareObject(ConstructingObjectParser.constructorArg(), TimingStats.PARSER, TIMING_STATS); + } + + private final Instant timestamp; + private final Parameters parameters; + private final TimingStats timingStats; + + public OutlierDetectionStats(Instant timestamp, Parameters parameters, TimingStats timingStats) { + this.timestamp = Instant.ofEpochMilli(Objects.requireNonNull(timestamp).toEpochMilli()); + this.parameters = Objects.requireNonNull(parameters); + this.timingStats = Objects.requireNonNull(timingStats); + } + + public Instant getTimestamp() { + return timestamp; + } + + public Parameters getParameters() { + return parameters; + } + + public TimingStats getTimingStats() { + return timingStats; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, ToXContent.Params params) throws IOException { + builder.startObject(); + builder.timeField(TIMESTAMP.getPreferredName(), TIMESTAMP.getPreferredName() + "_string", timestamp.toEpochMilli()); + builder.field(PARAMETERS.getPreferredName(), parameters); + builder.field(TIMING_STATS.getPreferredName(), timingStats); + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + OutlierDetectionStats that = (OutlierDetectionStats) o; + return Objects.equals(timestamp, that.timestamp) + && Objects.equals(parameters, that.parameters) + && Objects.equals(timingStats, that.timingStats); + } + + @Override + public int hashCode() { + return Objects.hash(timestamp, parameters, timingStats); + } + + @Override + public String getName() { + return NAME.getPreferredName(); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/Parameters.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/Parameters.java new file mode 100644 index 0000000000000..deafb55081de0 --- /dev/null +++ 
b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/Parameters.java @@ -0,0 +1,146 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.outlierdetection; + +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Objects; + +import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg; + +public class Parameters implements ToXContentObject { + + public static final ParseField N_NEIGHBORS = new ParseField("n_neighbors"); + public static final ParseField METHOD = new ParseField("method"); + public static final ParseField FEATURE_INFLUENCE_THRESHOLD = new ParseField("feature_influence_threshold"); + public static final ParseField COMPUTE_FEATURE_INFLUENCE = new ParseField("compute_feature_influence"); + public static final ParseField OUTLIER_FRACTION = new ParseField("outlier_fraction"); + public static final ParseField STANDARDIZATION_ENABLED = new ParseField("standardization_enabled"); + + @SuppressWarnings("unchecked") + public static ConstructingObjectParser PARSER = new ConstructingObjectParser<>("outlier_detection_parameters", + true, + a -> new Parameters( + (Integer) a[0], + (String) a[1], + (Boolean) a[2], + (Double) a[3], + (Double) a[4], + (Boolean) a[5] + )); + + static { + PARSER.declareInt(optionalConstructorArg(), N_NEIGHBORS); + PARSER.declareString(optionalConstructorArg(), METHOD); + PARSER.declareBoolean(optionalConstructorArg(), COMPUTE_FEATURE_INFLUENCE); + PARSER.declareDouble(optionalConstructorArg(), FEATURE_INFLUENCE_THRESHOLD); + PARSER.declareDouble(optionalConstructorArg(), OUTLIER_FRACTION); + PARSER.declareBoolean(optionalConstructorArg(), STANDARDIZATION_ENABLED); + } + + private final Integer nNeighbors; + private final String method; + private final Boolean computeFeatureInfluence; + private final Double featureInfluenceThreshold; + private final Double outlierFraction; + private final Boolean standardizationEnabled; + + public Parameters(Integer nNeighbors, String method, Boolean computeFeatureInfluence, Double featureInfluenceThreshold, + Double outlierFraction, Boolean standardizationEnabled) { + this.nNeighbors = nNeighbors; + this.method = method; + this.computeFeatureInfluence = computeFeatureInfluence; + this.featureInfluenceThreshold = featureInfluenceThreshold; + this.outlierFraction = outlierFraction; + this.standardizationEnabled = standardizationEnabled; + } + + public Integer getnNeighbors() { + return nNeighbors; + } + + public String getMethod() { + 
return method; + } + + public Boolean getComputeFeatureInfluence() { + return computeFeatureInfluence; + } + + public Double getFeatureInfluenceThreshold() { + return featureInfluenceThreshold; + } + + public Double getOutlierFraction() { + return outlierFraction; + } + + public Boolean getStandardizationEnabled() { + return standardizationEnabled; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (nNeighbors != null) { + builder.field(N_NEIGHBORS.getPreferredName(), nNeighbors); + } + if (method != null) { + builder.field(METHOD.getPreferredName(), method); + } + if (computeFeatureInfluence != null) { + builder.field(COMPUTE_FEATURE_INFLUENCE.getPreferredName(), computeFeatureInfluence); + } + if (featureInfluenceThreshold != null) { + builder.field(FEATURE_INFLUENCE_THRESHOLD.getPreferredName(), featureInfluenceThreshold); + } + if (outlierFraction != null) { + builder.field(OUTLIER_FRACTION.getPreferredName(), outlierFraction); + } + if (standardizationEnabled != null) { + builder.field(STANDARDIZATION_ENABLED.getPreferredName(), standardizationEnabled); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + + Parameters that = (Parameters) o; + return Objects.equals(nNeighbors, that.nNeighbors) + && Objects.equals(method, that.method) + && Objects.equals(computeFeatureInfluence, that.computeFeatureInfluence) + && Objects.equals(featureInfluenceThreshold, that.featureInfluenceThreshold) + && Objects.equals(outlierFraction, that.outlierFraction) + && Objects.equals(standardizationEnabled, that.standardizationEnabled); + } + + @Override + public int hashCode() { + return Objects.hash(nNeighbors, method, computeFeatureInfluence, featureInfluenceThreshold, outlierFraction, + standardizationEnabled); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/TimingStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/TimingStats.java new file mode 100644 index 0000000000000..96f93a6651de7 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/TimingStats.java @@ -0,0 +1,74 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.elasticsearch.client.ml.dataframe.stats.outlierdetection; + +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Objects; + +public class TimingStats implements ToXContentObject { + + public static final ParseField ELAPSED_TIME = new ParseField("elapsed_time"); + + public static ConstructingObjectParser<TimingStats, Void> PARSER = new ConstructingObjectParser<>("outlier_detection_timing_stats", + true, + a -> new TimingStats(a[0] == null ? null : TimeValue.timeValueMillis((long) a[0]))); + + static { + PARSER.declareLong(ConstructingObjectParser.optionalConstructorArg(), ELAPSED_TIME); + } + + private final TimeValue elapsedTime; + + public TimingStats(TimeValue elapsedTime) { + this.elapsedTime = elapsedTime; + } + + public TimeValue getElapsedTime() { + return elapsedTime; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (elapsedTime != null) { + builder.humanReadableField(ELAPSED_TIME.getPreferredName(), ELAPSED_TIME.getPreferredName() + "_string", elapsedTime); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + TimingStats that = (TimingStats) o; + return Objects.equals(elapsedTime, that.elapsedTime); + } + + @Override + public int hashCode() { + return Objects.hash(elapsedTime); + } +}
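The `_string` suffix passed to `humanReadableField` is worth a note: the companion field is only emitted when the builder has human-readable output switched on. A rough sketch of both output shapes (a throwaway `main` under my own assumptions; the 945ms value is invented):

```java
import org.elasticsearch.client.ml.dataframe.stats.outlierdetection.TimingStats;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.common.xcontent.ToXContent;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;

public class TimingStatsToXContentExample {
    public static void main(String[] args) throws Exception {
        TimingStats stats = new TimingStats(TimeValue.timeValueMillis(945));

        // Default builder: only the raw field, e.g. {"elapsed_time":945}
        XContentBuilder plain = XContentFactory.jsonBuilder();
        stats.toXContent(plain, ToXContent.EMPTY_PARAMS);
        System.out.println(Strings.toString(plain));

        // Human-readable builder also writes the companion field,
        // e.g. {"elapsed_time_string":"945ms","elapsed_time":945}
        XContentBuilder readable = XContentFactory.jsonBuilder().humanReadable(true);
        stats.toXContent(readable, ToXContent.EMPTY_PARAMS);
        System.out.println(Strings.toString(readable));
    }
}
```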
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/Hyperparameters.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/Hyperparameters.java new file mode 100644 index 0000000000000..cb1a0b99ab58b --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/Hyperparameters.java @@ -0,0 +1,278 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Objects; + +import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg; + +public class Hyperparameters implements ToXContentObject { + + public static final ParseField DOWNSAMPLE_FACTOR = new ParseField("downsample_factor"); + public static final ParseField ETA = new ParseField("eta"); + public static final ParseField ETA_GROWTH_RATE_PER_TREE = new ParseField("eta_growth_rate_per_tree"); + public static final ParseField FEATURE_BAG_FRACTION = new ParseField("feature_bag_fraction"); + public static final ParseField MAX_ATTEMPTS_TO_ADD_TREE = new ParseField("max_attempts_to_add_tree"); + public static final ParseField MAX_OPTIMIZATION_ROUNDS_PER_HYPERPARAMETER = new ParseField( + "max_optimization_rounds_per_hyperparameter"); + public static final ParseField MAX_TREES = new ParseField("max_trees"); + public static final ParseField NUM_FOLDS = new ParseField("num_folds"); + public static final ParseField NUM_SPLITS_PER_FEATURE = new ParseField("num_splits_per_feature"); + public static final ParseField REGULARIZATION_DEPTH_PENALTY_MULTIPLIER = new ParseField("regularization_depth_penalty_multiplier"); + public static final ParseField REGULARIZATION_LEAF_WEIGHT_PENALTY_MULTIPLIER + = new ParseField("regularization_leaf_weight_penalty_multiplier"); + public static final ParseField REGULARIZATION_SOFT_TREE_DEPTH_LIMIT = new ParseField("regularization_soft_tree_depth_limit"); + public static final ParseField REGULARIZATION_SOFT_TREE_DEPTH_TOLERANCE = new ParseField("regularization_soft_tree_depth_tolerance"); + public static final ParseField REGULARIZATION_TREE_SIZE_PENALTY_MULTIPLIER = + new ParseField("regularization_tree_size_penalty_multiplier"); + + public static ConstructingObjectParser<Hyperparameters, Void> PARSER = new ConstructingObjectParser<>("regression_hyperparameters", + true, + a -> new Hyperparameters( + (Double) a[0], + (Double) a[1], + (Double) a[2], + (Double) a[3], + (Integer) a[4], + (Integer) a[5], + (Integer) a[6], + (Integer) a[7], + (Integer) a[8], + (Double) a[9], + (Double) a[10], + (Double) a[11], + (Double) a[12], + (Double) a[13] + )); + + static { + PARSER.declareDouble(optionalConstructorArg(), DOWNSAMPLE_FACTOR); + PARSER.declareDouble(optionalConstructorArg(), ETA); + PARSER.declareDouble(optionalConstructorArg(), ETA_GROWTH_RATE_PER_TREE); + PARSER.declareDouble(optionalConstructorArg(), FEATURE_BAG_FRACTION); + PARSER.declareInt(optionalConstructorArg(), MAX_ATTEMPTS_TO_ADD_TREE); + PARSER.declareInt(optionalConstructorArg(), MAX_OPTIMIZATION_ROUNDS_PER_HYPERPARAMETER); + PARSER.declareInt(optionalConstructorArg(), MAX_TREES); + PARSER.declareInt(optionalConstructorArg(), NUM_FOLDS); + PARSER.declareInt(optionalConstructorArg(), NUM_SPLITS_PER_FEATURE); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_DEPTH_PENALTY_MULTIPLIER); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_LEAF_WEIGHT_PENALTY_MULTIPLIER); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_SOFT_TREE_DEPTH_LIMIT); + PARSER.declareDouble(optionalConstructorArg(), REGULARIZATION_SOFT_TREE_DEPTH_TOLERANCE); + PARSER.declareDouble(optionalConstructorArg(),
REGULARIZATION_TREE_SIZE_PENALTY_MULTIPLIER); + } + + private final Double downsampleFactor; + private final Double eta; + private final Double etaGrowthRatePerTree; + private final Double featureBagFraction; + private final Integer maxAttemptsToAddTree; + private final Integer maxOptimizationRoundsPerHyperparameter; + private final Integer maxTrees; + private final Integer numFolds; + private final Integer numSplitsPerFeature; + private final Double regularizationDepthPenaltyMultiplier; + private final Double regularizationLeafWeightPenaltyMultiplier; + private final Double regularizationSoftTreeDepthLimit; + private final Double regularizationSoftTreeDepthTolerance; + private final Double regularizationTreeSizePenaltyMultiplier; + + public Hyperparameters(Double downsampleFactor, + Double eta, + Double etaGrowthRatePerTree, + Double featureBagFraction, + Integer maxAttemptsToAddTree, + Integer maxOptimizationRoundsPerHyperparameter, + Integer maxTrees, + Integer numFolds, + Integer numSplitsPerFeature, + Double regularizationDepthPenaltyMultiplier, + Double regularizationLeafWeightPenaltyMultiplier, + Double regularizationSoftTreeDepthLimit, + Double regularizationSoftTreeDepthTolerance, + Double regularizationTreeSizePenaltyMultiplier) { + this.downsampleFactor = downsampleFactor; + this.eta = eta; + this.etaGrowthRatePerTree = etaGrowthRatePerTree; + this.featureBagFraction = featureBagFraction; + this.maxAttemptsToAddTree = maxAttemptsToAddTree; + this.maxOptimizationRoundsPerHyperparameter = maxOptimizationRoundsPerHyperparameter; + this.maxTrees = maxTrees; + this.numFolds = numFolds; + this.numSplitsPerFeature = numSplitsPerFeature; + this.regularizationDepthPenaltyMultiplier = regularizationDepthPenaltyMultiplier; + this.regularizationLeafWeightPenaltyMultiplier = regularizationLeafWeightPenaltyMultiplier; + this.regularizationSoftTreeDepthLimit = regularizationSoftTreeDepthLimit; + this.regularizationSoftTreeDepthTolerance = regularizationSoftTreeDepthTolerance; + this.regularizationTreeSizePenaltyMultiplier = regularizationTreeSizePenaltyMultiplier; + } + + public Double getDownsampleFactor() { + return downsampleFactor; + } + + public Double getEta() { + return eta; + } + + public Double getEtaGrowthRatePerTree() { + return etaGrowthRatePerTree; + } + + public Double getFeatureBagFraction() { + return featureBagFraction; + } + + public Integer getMaxAttemptsToAddTree() { + return maxAttemptsToAddTree; + } + + public Integer getMaxOptimizationRoundsPerHyperparameter() { + return maxOptimizationRoundsPerHyperparameter; + } + + public Integer getMaxTrees() { + return maxTrees; + } + + public Integer getNumFolds() { + return numFolds; + } + + public Integer getNumSplitsPerFeature() { + return numSplitsPerFeature; + } + + public Double getRegularizationDepthPenaltyMultiplier() { + return regularizationDepthPenaltyMultiplier; + } + + public Double getRegularizationLeafWeightPenaltyMultiplier() { + return regularizationLeafWeightPenaltyMultiplier; + } + + public Double getRegularizationSoftTreeDepthLimit() { + return regularizationSoftTreeDepthLimit; + } + + public Double getRegularizationSoftTreeDepthTolerance() { + return regularizationSoftTreeDepthTolerance; + } + + public Double getRegularizationTreeSizePenaltyMultiplier() { + return regularizationTreeSizePenaltyMultiplier; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (downsampleFactor != null) { + 
builder.field(DOWNSAMPLE_FACTOR.getPreferredName(), downsampleFactor); + } + if (eta != null) { + builder.field(ETA.getPreferredName(), eta); + } + if (etaGrowthRatePerTree != null) { + builder.field(ETA_GROWTH_RATE_PER_TREE.getPreferredName(), etaGrowthRatePerTree); + } + if (featureBagFraction != null) { + builder.field(FEATURE_BAG_FRACTION.getPreferredName(), featureBagFraction); + } + if (maxAttemptsToAddTree != null) { + builder.field(MAX_ATTEMPTS_TO_ADD_TREE.getPreferredName(), maxAttemptsToAddTree); + } + if (maxOptimizationRoundsPerHyperparameter != null) { + builder.field(MAX_OPTIMIZATION_ROUNDS_PER_HYPERPARAMETER.getPreferredName(), maxOptimizationRoundsPerHyperparameter); + } + if (maxTrees != null) { + builder.field(MAX_TREES.getPreferredName(), maxTrees); + } + if (numFolds != null) { + builder.field(NUM_FOLDS.getPreferredName(), numFolds); + } + if (numSplitsPerFeature != null) { + builder.field(NUM_SPLITS_PER_FEATURE.getPreferredName(), numSplitsPerFeature); + } + if (regularizationDepthPenaltyMultiplier != null) { + builder.field(REGULARIZATION_DEPTH_PENALTY_MULTIPLIER.getPreferredName(), regularizationDepthPenaltyMultiplier); + } + if (regularizationLeafWeightPenaltyMultiplier != null) { + builder.field(REGULARIZATION_LEAF_WEIGHT_PENALTY_MULTIPLIER.getPreferredName(), regularizationLeafWeightPenaltyMultiplier); + } + if (regularizationSoftTreeDepthLimit != null) { + builder.field(REGULARIZATION_SOFT_TREE_DEPTH_LIMIT.getPreferredName(), regularizationSoftTreeDepthLimit); + } + if (regularizationSoftTreeDepthTolerance != null) { + builder.field(REGULARIZATION_SOFT_TREE_DEPTH_TOLERANCE.getPreferredName(), regularizationSoftTreeDepthTolerance); + } + if (regularizationTreeSizePenaltyMultiplier != null) { + builder.field(REGULARIZATION_TREE_SIZE_PENALTY_MULTIPLIER.getPreferredName(), regularizationTreeSizePenaltyMultiplier); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + + Hyperparameters that = (Hyperparameters) o; + return Objects.equals(downsampleFactor, that.downsampleFactor) + && Objects.equals(eta, that.eta) + && Objects.equals(etaGrowthRatePerTree, that.etaGrowthRatePerTree) + && Objects.equals(featureBagFraction, that.featureBagFraction) + && Objects.equals(maxAttemptsToAddTree, that.maxAttemptsToAddTree) + && Objects.equals(maxOptimizationRoundsPerHyperparameter, that.maxOptimizationRoundsPerHyperparameter) + && Objects.equals(maxTrees, that.maxTrees) + && Objects.equals(numFolds, that.numFolds) + && Objects.equals(numSplitsPerFeature, that.numSplitsPerFeature) + && Objects.equals(regularizationDepthPenaltyMultiplier, that.regularizationDepthPenaltyMultiplier) + && Objects.equals(regularizationLeafWeightPenaltyMultiplier, that.regularizationLeafWeightPenaltyMultiplier) + && Objects.equals(regularizationSoftTreeDepthLimit, that.regularizationSoftTreeDepthLimit) + && Objects.equals(regularizationSoftTreeDepthTolerance, that.regularizationSoftTreeDepthTolerance) + && Objects.equals(regularizationTreeSizePenaltyMultiplier, that.regularizationTreeSizePenaltyMultiplier); + } + + @Override + public int hashCode() { + return Objects.hash( + downsampleFactor, + eta, + etaGrowthRatePerTree, + featureBagFraction, + maxAttemptsToAddTree, + maxOptimizationRoundsPerHyperparameter, + maxTrees, + numFolds, + numSplitsPerFeature, + regularizationDepthPenaltyMultiplier, + regularizationLeafWeightPenaltyMultiplier, + 
regularizationSoftTreeDepthLimit, + regularizationSoftTreeDepthTolerance, + regularizationTreeSizePenaltyMultiplier + ); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/RegressionStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/RegressionStats.java new file mode 100644 index 0000000000000..7e890c3618f82 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/RegressionStats.java @@ -0,0 +1,135 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.client.common.TimeUtil; +import org.elasticsearch.client.ml.dataframe.stats.AnalysisStats; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ObjectParser; +import org.elasticsearch.common.xcontent.ToXContent; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.time.Instant; +import java.util.Objects; + +public class RegressionStats implements AnalysisStats { + + public static final ParseField NAME = new ParseField("regression_stats"); + + public static final ParseField TIMESTAMP = new ParseField("timestamp"); + public static final ParseField ITERATION = new ParseField("iteration"); + public static final ParseField HYPERPARAMETERS = new ParseField("hyperparameters"); + public static final ParseField TIMING_STATS = new ParseField("timing_stats"); + public static final ParseField VALIDATION_LOSS = new ParseField("validation_loss"); + + public static final ConstructingObjectParser PARSER = new ConstructingObjectParser<>(NAME.getPreferredName(), + true, + a -> new RegressionStats( + (Instant) a[0], + (Integer) a[1], + (Hyperparameters) a[2], + (TimingStats) a[3], + (ValidationLoss) a[4] + ) + ); + + static { + PARSER.declareField(ConstructingObjectParser.constructorArg(), + p -> TimeUtil.parseTimeFieldToInstant(p, TIMESTAMP.getPreferredName()), + TIMESTAMP, + ObjectParser.ValueType.VALUE); + PARSER.declareInt(ConstructingObjectParser.optionalConstructorArg(), ITERATION); + PARSER.declareObject(ConstructingObjectParser.constructorArg(), Hyperparameters.PARSER, HYPERPARAMETERS); + PARSER.declareObject(ConstructingObjectParser.constructorArg(), TimingStats.PARSER, TIMING_STATS); + PARSER.declareObject(ConstructingObjectParser.constructorArg(), ValidationLoss.PARSER, VALIDATION_LOSS); + } + + private final Instant timestamp; + private final Integer iteration; + private final Hyperparameters hyperparameters; + private final TimingStats timingStats; + private final ValidationLoss validationLoss; + + 
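Before the constructor below, a note on how these fields arrive off the wire: `iteration` is optional (it can be absent in early progress reports), while `hyperparameters`, `timing_stats` and `validation_loss` are required constructor args. A minimal sketch of JSON this parser accepts (all values invented):

```java
import org.elasticsearch.client.ml.dataframe.stats.regression.RegressionStats;
import org.elasticsearch.common.xcontent.DeprecationHandler;
import org.elasticsearch.common.xcontent.NamedXContentRegistry;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.xcontent.XContentType;

public class RegressionStatsParseExample {
    public static void main(String[] args) throws Exception {
        String json = "{"
            + "\"timestamp\": 1585313978000,"   // required, epoch millis
            + "\"iteration\": 7,"               // optional
            + "\"hyperparameters\": {\"eta\": 0.02, \"max_trees\": 894},"
            + "\"timing_stats\": {\"elapsed_time\": 93422, \"iteration_time\": 12904},"
            + "\"validation_loss\": {\"loss_type\": \"mse\","
            + "  \"fold_values\": [{\"fold\": 0, \"values\": [0.86, 0.84]}]}"
            + "}";
        try (XContentParser parser = XContentType.JSON.xContent().createParser(
                NamedXContentRegistry.EMPTY, DeprecationHandler.THROW_UNSUPPORTED_OPERATION, json)) {
            RegressionStats stats = RegressionStats.PARSER.apply(parser, null);
            System.out.println(stats.getIteration());                    // 7
            System.out.println(stats.getTimingStats().getElapsedTime()); // TimeValue of 93422ms
        }
    }
}
```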
public RegressionStats(Instant timestamp, Integer iteration, Hyperparameters hyperparameters, TimingStats timingStats, + ValidationLoss validationLoss) { + this.timestamp = Instant.ofEpochMilli(Objects.requireNonNull(timestamp).toEpochMilli()); + this.iteration = iteration; + this.hyperparameters = Objects.requireNonNull(hyperparameters); + this.timingStats = Objects.requireNonNull(timingStats); + this.validationLoss = Objects.requireNonNull(validationLoss); + } + + public Instant getTimestamp() { + return timestamp; + } + + public Integer getIteration() { + return iteration; + } + + public Hyperparameters getHyperparameters() { + return hyperparameters; + } + + public TimingStats getTimingStats() { + return timingStats; + } + + public ValidationLoss getValidationLoss() { + return validationLoss; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, ToXContent.Params params) throws IOException { + builder.startObject(); + builder.timeField(TIMESTAMP.getPreferredName(), TIMESTAMP.getPreferredName() + "_string", timestamp.toEpochMilli()); + if (iteration != null) { + builder.field(ITERATION.getPreferredName(), iteration); + } + builder.field(HYPERPARAMETERS.getPreferredName(), hyperparameters); + builder.field(TIMING_STATS.getPreferredName(), timingStats); + builder.field(VALIDATION_LOSS.getPreferredName(), validationLoss); + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + RegressionStats that = (RegressionStats) o; + return Objects.equals(timestamp, that.timestamp) + && Objects.equals(iteration, that.iteration) + && Objects.equals(hyperparameters, that.hyperparameters) + && Objects.equals(timingStats, that.timingStats) + && Objects.equals(validationLoss, that.validationLoss); + } + + @Override + public int hashCode() { + return Objects.hash(timestamp, iteration, hyperparameters, timingStats, validationLoss); + } + + @Override + public String getName() { + return NAME.getPreferredName(); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/TimingStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/TimingStats.java new file mode 100644 index 0000000000000..1a844a410f469 --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/TimingStats.java @@ -0,0 +1,87 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.Objects; + +public class TimingStats implements ToXContentObject { + + public static final ParseField ELAPSED_TIME = new ParseField("elapsed_time"); + public static final ParseField ITERATION_TIME = new ParseField("iteration_time"); + + public static ConstructingObjectParser PARSER = new ConstructingObjectParser<>("regression_timing_stats", true, + a -> new TimingStats( + a[0] == null ? null : TimeValue.timeValueMillis((long) a[0]), + a[1] == null ? null : TimeValue.timeValueMillis((long) a[1]) + )); + + static { + PARSER.declareLong(ConstructingObjectParser.optionalConstructorArg(), ELAPSED_TIME); + PARSER.declareLong(ConstructingObjectParser.optionalConstructorArg(), ITERATION_TIME); + } + + private final TimeValue elapsedTime; + private final TimeValue iterationTime; + + public TimingStats(TimeValue elapsedTime, TimeValue iterationTime) { + this.elapsedTime = elapsedTime; + this.iterationTime = iterationTime; + } + + public TimeValue getElapsedTime() { + return elapsedTime; + } + + public TimeValue getIterationTime() { + return iterationTime; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (elapsedTime != null) { + builder.humanReadableField(ELAPSED_TIME.getPreferredName(), ELAPSED_TIME.getPreferredName() + "_string", elapsedTime); + } + if (iterationTime != null) { + builder.humanReadableField(ITERATION_TIME.getPreferredName(), ITERATION_TIME.getPreferredName() + "_string", iterationTime); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + TimingStats that = (TimingStats) o; + return Objects.equals(elapsedTime, that.elapsedTime) && Objects.equals(iterationTime, that.iterationTime); + } + + @Override + public int hashCode() { + return Objects.hash(elapsedTime, iterationTime); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/ValidationLoss.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/ValidationLoss.java new file mode 100644 index 0000000000000..ee2513b0f395f --- /dev/null +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/dataframe/stats/regression/ValidationLoss.java @@ -0,0 +1,87 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. 
See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.client.ml.dataframe.stats.common.FoldValues; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.List; +import java.util.Objects; + +public class ValidationLoss implements ToXContentObject { + + public static final ParseField LOSS_TYPE = new ParseField("loss_type"); + public static final ParseField FOLD_VALUES = new ParseField("fold_values"); + + @SuppressWarnings("unchecked") + public static ConstructingObjectParser<ValidationLoss, Void> PARSER = new ConstructingObjectParser<>("regression_validation_loss", + true, + a -> new ValidationLoss((String) a[0], (List<FoldValues>) a[1])); + + static { + PARSER.declareString(ConstructingObjectParser.optionalConstructorArg(), LOSS_TYPE); + PARSER.declareObjectArray(ConstructingObjectParser.optionalConstructorArg(), FoldValues.PARSER, FOLD_VALUES); + } + + private final String lossType; + private final List<FoldValues> foldValues; + + public ValidationLoss(String lossType, List<FoldValues> values) { + this.lossType = lossType; + this.foldValues = values; + } + + public String getLossType() { + return lossType; + } + + public List<FoldValues> getFoldValues() { + return foldValues; + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + if (lossType != null) { + builder.field(LOSS_TYPE.getPreferredName(), lossType); + } + if (foldValues != null) { + builder.field(FOLD_VALUES.getPreferredName(), foldValues); + } + builder.endObject(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + ValidationLoss that = (ValidationLoss) o; + return Objects.equals(lossType, that.lossType) && Objects.equals(foldValues, that.foldValues); + } + + @Override + public int hashCode() { + return Objects.hash(lossType, foldValues); + } +} diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/job/config/RuleScope.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/job/config/RuleScope.java index 8b6886d582524..95e727b818abe 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/job/config/RuleScope.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/ml/job/config/RuleScope.java @@ -50,7 +50,7 @@ public static ContextParser<Void, RuleScope> parser() { Map<String, Object> value = (Map<String, Object>) entry.getValue(); builder.map(value); try (XContentParser scopeParser = XContentFactory.xContent(builder.contentType()).createParser( - NamedXContentRegistry.EMPTY, DEPRECATION_HANDLER, Strings.toString(builder))) { + NamedXContentRegistry.EMPTY, DeprecationHandler.IGNORE_DEPRECATIONS, Strings.toString(builder))) { scope.put(entry.getKey(), FilterRef.PARSER.parse(scopeParser, null)); } } @@ -59,15 +59,6 @@ public static ContextParser<Void, RuleScope> parser() { }; } - private static final DeprecationHandler DEPRECATION_HANDLER = new DeprecationHandler() { - - @Override - public void usedDeprecatedName(String usedName, String modernName) {} - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) {} - }; - private final Map<String, FilterRef> scope; public RuleScope() {
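The RuleScope change is a cleanup rather than a behaviour change: the hand-rolled no-op handler is replaced by the equivalent built-in `DeprecationHandler.IGNORE_DEPRECATIONS` constant, which only matters when a deprecated field name is parsed. A small sketch of the pattern (the filter JSON is invented; the constant itself is the one used in the hunk above):

```java
import org.elasticsearch.common.xcontent.DeprecationHandler;
import org.elasticsearch.common.xcontent.NamedXContentRegistry;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.xcontent.XContentType;

public class LenientScopeParsing {
    public static void main(String[] args) throws Exception {
        String json = "{\"filter_id\": \"safe_domains\", \"filter_type\": \"include\"}";
        // THROW_UNSUPPORTED_OPERATION would fail fast on any deprecated name;
        // IGNORE_DEPRECATIONS accepts it silently, matching the deleted
        // anonymous handler whose callbacks were empty.
        try (XContentParser parser = XContentType.JSON.xContent().createParser(
                NamedXContentRegistry.EMPTY, DeprecationHandler.IGNORE_DEPRECATIONS, json)) {
            System.out.println(parser.map());
        }
    }
}
```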
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/rollup/GetRollupJobResponse.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/rollup/GetRollupJobResponse.java index e63daf5949002..9ded34aa05670 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/rollup/GetRollupJobResponse.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/rollup/GetRollupJobResponse.java @@ -177,16 +177,18 @@ public final String toString() { public static class RollupIndexerJobStats extends IndexerJobStats { RollupIndexerJobStats(long numPages, long numInputDocuments, long numOuputDocuments, long numInvocations, - long indexTime, long indexTotal, long searchTime, long searchTotal, long indexFailures, long searchFailures) { + long indexTime, long indexTotal, long searchTime, long searchTotal, long processingTime, + long processingTotal, long indexFailures, long searchFailures) { super(numPages, numInputDocuments, numOuputDocuments, numInvocations, - indexTime, searchTime, indexTotal, searchTotal, indexFailures, searchFailures); + indexTime, searchTime, processingTime, indexTotal, searchTotal, processingTotal, indexFailures, searchFailures); } private static final ConstructingObjectParser<RollupIndexerJobStats, Void> PARSER = new ConstructingObjectParser<>( STATS.getPreferredName(), true, args -> new RollupIndexerJobStats((long) args[0], (long) args[1], (long) args[2], (long) args[3], - (long) args[4], (long) args[5], (long) args[6], (long) args[7], (long) args[8], (long) args[9])); + (long) args[4], (long) args[5], (long) args[6], (long) args[7], (long) args[8], (long) args[9], + (long) args[10], (long) args[11])); static { PARSER.declareLong(constructorArg(), NUM_PAGES); PARSER.declareLong(constructorArg(), NUM_INPUT_DOCUMENTS); @@ -196,6 +198,8 @@ public static class RollupIndexerJobStats extends IndexerJobStats { PARSER.declareLong(constructorArg(), INDEX_TOTAL); PARSER.declareLong(constructorArg(), SEARCH_TIME_IN_MS); PARSER.declareLong(constructorArg(), SEARCH_TOTAL); + PARSER.declareLong(constructorArg(), PROCESSING_TIME_IN_MS); + PARSER.declareLong(constructorArg(), PROCESSING_TOTAL); PARSER.declareLong(constructorArg(), INDEX_FAILURES); PARSER.declareLong(constructorArg(), SEARCH_FAILURES); } diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/PreviewTransformResponse.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/PreviewTransformResponse.java index 215d529f94993..12a37d1f9d791 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/PreviewTransformResponse.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/PreviewTransformResponse.java @@ -19,40 +19,167 @@ package org.elasticsearch.client.transform; +import org.elasticsearch.action.admin.indices.alias.Alias; +import org.elasticsearch.client.indices.CreateIndexRequest; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.settings.Settings; +import org.elasticsearch.common.xcontent.ConstructingObjectParser; import org.elasticsearch.common.xcontent.XContentParser; import java.io.IOException; +import java.util.Collections; +import java.util.HashSet; import java.util.List; import java.util.Map; import java.util.Objects; +import java.util.Set; + +import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg; public class PreviewTransformResponse { - private static final String PREVIEW = "preview"; - private static final String MAPPINGS =
"mappings"; + public static class GeneratedDestIndexSettings { + static final ParseField MAPPINGS = new ParseField("mappings"); + private static final ParseField SETTINGS = new ParseField("settings"); + private static final ParseField ALIASES = new ParseField("aliases"); - @SuppressWarnings("unchecked") - public static PreviewTransformResponse fromXContent(final XContentParser parser) throws IOException { - Map previewMap = parser.mapOrdered(); - Object previewDocs = previewMap.get(PREVIEW); - Object mappings = previewMap.get(MAPPINGS); - return new PreviewTransformResponse((List>) previewDocs, (Map) mappings); + private final Map mappings; + private final Settings settings; + private final Set aliases; + + private static final ConstructingObjectParser PARSER = new ConstructingObjectParser<>( + "transform_preview_generated_dest_index", + true, + args -> { + @SuppressWarnings("unchecked") + Map mappings = (Map) args[0]; + Settings settings = (Settings) args[1]; + @SuppressWarnings("unchecked") + Set aliases = (Set) args[2]; + + return new GeneratedDestIndexSettings(mappings, settings, aliases); + } + ); + + static { + PARSER.declareObject(optionalConstructorArg(), (p, c) -> p.mapOrdered(), MAPPINGS); + PARSER.declareObject(optionalConstructorArg(), (p, c) -> Settings.fromXContent(p), SETTINGS); + PARSER.declareObject(optionalConstructorArg(), (p, c) -> { + Set aliases = new HashSet<>(); + while ((p.nextToken()) != XContentParser.Token.END_OBJECT) { + aliases.add(Alias.fromXContent(p)); + } + return aliases; + }, ALIASES); + } + + public GeneratedDestIndexSettings(Map mappings, Settings settings, Set aliases) { + this.mappings = mappings == null ? Collections.emptyMap() : Collections.unmodifiableMap(mappings); + this.settings = settings == null ? Settings.EMPTY : settings; + this.aliases = aliases == null ? 
Collections.emptySet() : Collections.unmodifiableSet(aliases); + } + + public Map getMappings() { + return mappings; + } + + public Settings getSettings() { + return settings; + } + + public Set getAliases() { + return aliases; + } + + public static GeneratedDestIndexSettings fromXContent(final XContentParser parser) { + return PARSER.apply(parser, null); + } + + @Override + public boolean equals(Object obj) { + if (obj == this) { + return true; + } + + if (obj == null || obj.getClass() != getClass()) { + return false; + } + + GeneratedDestIndexSettings other = (GeneratedDestIndexSettings) obj; + return Objects.equals(other.mappings, mappings) + && Objects.equals(other.settings, settings) + && Objects.equals(other.aliases, aliases); + } + + @Override + public int hashCode() { + return Objects.hash(mappings, settings, aliases); + } } - private List> docs; - private Map mappings; + public static final ParseField PREVIEW = new ParseField("preview"); + public static final ParseField GENERATED_DEST_INDEX_SETTINGS = new ParseField("generated_dest_index"); + + private final List> docs; + private final GeneratedDestIndexSettings generatedDestIndexSettings; + + private static final ConstructingObjectParser PARSER = new ConstructingObjectParser<>( + "data_frame_transform_preview", + true, + args -> { + @SuppressWarnings("unchecked") + List> docs = (List>) args[0]; + GeneratedDestIndexSettings generatedDestIndex = (GeneratedDestIndexSettings) args[1]; + + // ensure generatedDestIndex is not null + if (generatedDestIndex == null) { + // BWC parsing the output from nodes < 7.7 + @SuppressWarnings("unchecked") + Map mappings = (Map) args[2]; + generatedDestIndex = new GeneratedDestIndexSettings(mappings, null, null); + } - public PreviewTransformResponse(List> docs, Map mappings) { + return new PreviewTransformResponse(docs, generatedDestIndex); + } + ); + static { + PARSER.declareObjectArray(optionalConstructorArg(), (p, c) -> p.mapOrdered(), PREVIEW); + PARSER.declareObject(optionalConstructorArg(), (p, c) -> GeneratedDestIndexSettings.fromXContent(p), GENERATED_DEST_INDEX_SETTINGS); + PARSER.declareObject(optionalConstructorArg(), (p, c) -> p.mapOrdered(), GeneratedDestIndexSettings.MAPPINGS); + } + + public PreviewTransformResponse(List> docs, GeneratedDestIndexSettings generatedDestIndexSettings) { this.docs = docs; - this.mappings = mappings; + this.generatedDestIndexSettings = generatedDestIndexSettings; } public List> getDocs() { return docs; } + public GeneratedDestIndexSettings getGeneratedDestIndexSettings() { + return generatedDestIndexSettings; + } + public Map getMappings() { - return mappings; + return generatedDestIndexSettings.getMappings(); + } + + public Settings getSettings() { + return generatedDestIndexSettings.getSettings(); + } + + public Set getAliases() { + return generatedDestIndexSettings.getAliases(); + } + + public CreateIndexRequest getCreateIndexRequest(String index) { + CreateIndexRequest createIndexRequest = new CreateIndexRequest(index); + createIndexRequest.aliases(generatedDestIndexSettings.getAliases()); + createIndexRequest.settings(generatedDestIndexSettings.getSettings()); + createIndexRequest.mapping(generatedDestIndexSettings.getMappings()); + + return createIndexRequest; } @Override @@ -66,12 +193,15 @@ public boolean equals(Object obj) { } PreviewTransformResponse other = (PreviewTransformResponse) obj; - return Objects.equals(other.docs, docs) && Objects.equals(other.mappings, mappings); + return Objects.equals(other.docs, docs) && 
Objects.equals(other.generatedDestIndexSettings, generatedDestIndexSettings); } @Override public int hashCode() { - return Objects.hash(docs, mappings); + return Objects.hash(docs, generatedDestIndexSettings); } + public static PreviewTransformResponse fromXContent(final XContentParser parser) throws IOException { + return PARSER.parse(parser, null); + } } diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/transforms/TransformIndexerStats.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/transforms/TransformIndexerStats.java index 2a04c6ea45eb5..e3a0032e55b0d 100644 --- a/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/transforms/TransformIndexerStats.java +++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/transform/transforms/TransformIndexerStats.java @@ -27,7 +27,6 @@ import java.io.IOException; import java.util.Objects; -import static org.elasticsearch.common.xcontent.ConstructingObjectParser.constructorArg; import static org.elasticsearch.common.xcontent.ConstructingObjectParser.optionalConstructorArg; public class TransformIndexerStats extends IndexerJobStats { @@ -39,21 +38,38 @@ public class TransformIndexerStats extends IndexerJobStats { public static final ConstructingObjectParser LENIENT_PARSER = new ConstructingObjectParser<>( NAME, true, - args -> new TransformIndexerStats((long) args[0], (long) args[1], (long) args[2], - (long) args[3], (long) args[4], (long) args[5], (long) args[6], (long) args[7], (long) args[8], (long) args[9], - (Double) args[10], (Double) args[11], (Double) args[12])); + args -> new TransformIndexerStats( + unboxSafe(args[0], 0L), + unboxSafe(args[1], 0L), + unboxSafe(args[2], 0L), + unboxSafe(args[3], 0L), + unboxSafe(args[4], 0L), + unboxSafe(args[5], 0L), + unboxSafe(args[6], 0L), + unboxSafe(args[7], 0L), + unboxSafe(args[8], 0L), + unboxSafe(args[9], 0L), + unboxSafe(args[10], 0L), + unboxSafe(args[11], 0L), + unboxSafe(args[12], 0.0), + unboxSafe(args[13], 0.0), + unboxSafe(args[14], 0.0) + ) + ); static { - LENIENT_PARSER.declareLong(constructorArg(), NUM_PAGES); - LENIENT_PARSER.declareLong(constructorArg(), NUM_INPUT_DOCUMENTS); - LENIENT_PARSER.declareLong(constructorArg(), NUM_OUTPUT_DOCUMENTS); - LENIENT_PARSER.declareLong(constructorArg(), NUM_INVOCATIONS); - LENIENT_PARSER.declareLong(constructorArg(), INDEX_TIME_IN_MS); - LENIENT_PARSER.declareLong(constructorArg(), SEARCH_TIME_IN_MS); - LENIENT_PARSER.declareLong(constructorArg(), INDEX_TOTAL); - LENIENT_PARSER.declareLong(constructorArg(), SEARCH_TOTAL); - LENIENT_PARSER.declareLong(constructorArg(), INDEX_FAILURES); - LENIENT_PARSER.declareLong(constructorArg(), SEARCH_FAILURES); + LENIENT_PARSER.declareLong(optionalConstructorArg(), NUM_PAGES); + LENIENT_PARSER.declareLong(optionalConstructorArg(), NUM_INPUT_DOCUMENTS); + LENIENT_PARSER.declareLong(optionalConstructorArg(), NUM_OUTPUT_DOCUMENTS); + LENIENT_PARSER.declareLong(optionalConstructorArg(), NUM_INVOCATIONS); + LENIENT_PARSER.declareLong(optionalConstructorArg(), INDEX_TIME_IN_MS); + LENIENT_PARSER.declareLong(optionalConstructorArg(), SEARCH_TIME_IN_MS); + LENIENT_PARSER.declareLong(optionalConstructorArg(), PROCESSING_TIME_IN_MS); + LENIENT_PARSER.declareLong(optionalConstructorArg(), INDEX_TOTAL); + LENIENT_PARSER.declareLong(optionalConstructorArg(), SEARCH_TOTAL); + LENIENT_PARSER.declareLong(optionalConstructorArg(), PROCESSING_TOTAL); + LENIENT_PARSER.declareLong(optionalConstructorArg(), INDEX_FAILURES); + 
LENIENT_PARSER.declareLong(optionalConstructorArg(), SEARCH_FAILURES); LENIENT_PARSER.declareDouble(optionalConstructorArg(), EXPONENTIAL_AVG_CHECKPOINT_DURATION_MS); LENIENT_PARSER.declareDouble(optionalConstructorArg(), EXPONENTIAL_AVG_DOCUMENTS_INDEXED); LENIENT_PARSER.declareDouble(optionalConstructorArg(), EXPONENTIAL_AVG_DOCUMENTS_PROCESSED); @@ -67,16 +83,40 @@ public static TransformIndexerStats fromXContent(XContentParser parser) throws I private final double expAvgDocumentsIndexed; private final double expAvgDocumentsProcessed; - public TransformIndexerStats(long numPages, long numInputDocuments, long numOuputDocuments, - long numInvocations, long indexTime, long searchTime, - long indexTotal, long searchTotal, long indexFailures, long searchFailures, - Double expAvgCheckpointDurationMs, Double expAvgDocumentsIndexed, - Double expAvgDocumentsProcessed) { - super(numPages, numInputDocuments, numOuputDocuments, numInvocations, indexTime, searchTime, - indexTotal, searchTotal, indexFailures, searchFailures); - this.expAvgCheckpointDurationMs = expAvgCheckpointDurationMs == null ? 0.0 : expAvgCheckpointDurationMs; - this.expAvgDocumentsIndexed = expAvgDocumentsIndexed == null ? 0.0 : expAvgDocumentsIndexed; - this.expAvgDocumentsProcessed = expAvgDocumentsProcessed == null ? 0.0 : expAvgDocumentsProcessed; + public TransformIndexerStats( + long numPages, + long numInputDocuments, + long numOuputDocuments, + long numInvocations, + long indexTime, + long searchTime, + long processingTime, + long indexTotal, + long searchTotal, + long processingTotal, + long indexFailures, + long searchFailures, + double expAvgCheckpointDurationMs, + double expAvgDocumentsIndexed, + double expAvgDocumentsProcessed + ) { + super( + numPages, + numInputDocuments, + numOuputDocuments, + numInvocations, + indexTime, + searchTime, + processingTime, + indexTotal, + searchTotal, + processingTotal, + indexFailures, + searchFailures + ); + this.expAvgCheckpointDurationMs = expAvgCheckpointDurationMs; + this.expAvgDocumentsIndexed = expAvgDocumentsIndexed; + this.expAvgDocumentsProcessed = expAvgDocumentsProcessed; } public double getExpAvgCheckpointDurationMs() { @@ -109,10 +149,12 @@ public boolean equals(Object other) { && Objects.equals(this.numInvocations, that.numInvocations) && Objects.equals(this.indexTime, that.indexTime) && Objects.equals(this.searchTime, that.searchTime) + && Objects.equals(this.processingTime, that.processingTime) && Objects.equals(this.indexFailures, that.indexFailures) && Objects.equals(this.searchFailures, that.searchFailures) && Objects.equals(this.indexTotal, that.indexTotal) && Objects.equals(this.searchTotal, that.searchTotal) + && Objects.equals(this.processingTotal, that.processingTotal) && Objects.equals(this.expAvgCheckpointDurationMs, that.expAvgCheckpointDurationMs) && Objects.equals(this.expAvgDocumentsIndexed, that.expAvgDocumentsIndexed) && Objects.equals(this.expAvgDocumentsProcessed, that.expAvgDocumentsProcessed); @@ -120,8 +162,31 @@ public boolean equals(Object other) { @Override public int hashCode() { - return Objects.hash(numPages, numInputDocuments, numOuputDocuments, numInvocations, - indexTime, searchTime, indexFailures, searchFailures, indexTotal, searchTotal, - expAvgCheckpointDurationMs, expAvgDocumentsIndexed, expAvgDocumentsProcessed); + return Objects.hash( + numPages, + numInputDocuments, + numOuputDocuments, + numInvocations, + indexTime, + searchTime, + processingTime, + indexFailures, + searchFailures, + indexTotal, + searchTotal, + 
processingTotal, + expAvgCheckpointDurationMs, + expAvgDocumentsIndexed, + expAvgDocumentsProcessed + ); + } + + @SuppressWarnings("unchecked") + private static T unboxSafe(Object l, T default_value) { + if (l == null) { + return default_value; + } else { + return (T) l; + } } } diff --git a/client/rest-high-level/src/main/resources/META-INF/services/org.elasticsearch.plugins.spi.NamedXContentProvider b/client/rest-high-level/src/main/resources/META-INF/services/org.elasticsearch.plugins.spi.NamedXContentProvider index 145d06bd46b76..9426b3d1bdde7 100644 --- a/client/rest-high-level/src/main/resources/META-INF/services/org.elasticsearch.plugins.spi.NamedXContentProvider +++ b/client/rest-high-level/src/main/resources/META-INF/services/org.elasticsearch.plugins.spi.NamedXContentProvider @@ -1,5 +1,6 @@ org.elasticsearch.client.ilm.IndexLifecycleNamedXContentProvider org.elasticsearch.client.ml.dataframe.MlDataFrameAnalysisNamedXContentProvider org.elasticsearch.client.ml.dataframe.evaluation.MlEvaluationNamedXContentProvider +org.elasticsearch.client.ml.dataframe.stats.AnalysisStatsNamedXContentProvider org.elasticsearch.client.ml.inference.MlInferenceNamedXContentProvider org.elasticsearch.client.transform.TransformNamedXContentProvider diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/AsyncSearchRequestConvertersTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/AsyncSearchRequestConvertersTests.java new file mode 100644 index 0000000000000..a3e2c0cea7d9c --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/AsyncSearchRequestConvertersTests.java @@ -0,0 +1,150 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.elasticsearch.client; + +import org.apache.http.client.methods.HttpDelete; +import org.apache.http.client.methods.HttpGet; +import org.apache.http.client.methods.HttpPost; +import org.elasticsearch.action.search.SearchType; +import org.elasticsearch.client.asyncsearch.DeleteAsyncSearchRequest; +import org.elasticsearch.client.asyncsearch.GetAsyncSearchRequest; +import org.elasticsearch.client.asyncsearch.SubmitAsyncSearchRequest; +import org.elasticsearch.common.Strings; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.rest.action.search.RestSearchAction; +import org.elasticsearch.search.builder.SearchSourceBuilder; +import org.elasticsearch.test.ESTestCase; + +import java.util.HashMap; +import java.util.Locale; +import java.util.Map; +import java.util.StringJoiner; + +import static org.elasticsearch.client.RequestConvertersTests.createTestSearchSourceBuilder; +import static org.elasticsearch.client.RequestConvertersTests.setRandomIndicesOptions; + +public class AsyncSearchRequestConvertersTests extends ESTestCase { + + public void testSubmitAsyncSearch() throws Exception { + String[] indices = RequestConvertersTests.randomIndicesNames(0, 5); + Map expectedParams = new HashMap<>(); + SearchSourceBuilder searchSourceBuilder = createTestSearchSourceBuilder(); + SubmitAsyncSearchRequest submitRequest = new SubmitAsyncSearchRequest(searchSourceBuilder, indices); + + // the following parameters might be overwritten by random ones later, + // but we need to set these since they are the default we send over http + expectedParams.put("request_cache", "true"); + expectedParams.put("batched_reduce_size", "5"); + setRandomSearchParams(submitRequest, expectedParams); + setRandomIndicesOptions(submitRequest::setIndicesOptions, submitRequest::getIndicesOptions, expectedParams); + + if (randomBoolean()) { + boolean cleanOnCompletion = randomBoolean(); + submitRequest.setCleanOnCompletion(cleanOnCompletion); + expectedParams.put("clean_on_completion", Boolean.toString(cleanOnCompletion)); + } + if (randomBoolean()) { + TimeValue keepAlive = TimeValue.parseTimeValue(randomTimeValue(), "test"); + submitRequest.setKeepAlive(keepAlive); + expectedParams.put("keep_alive", keepAlive.getStringRep()); + } + if (randomBoolean()) { + TimeValue waitForCompletion = TimeValue.parseTimeValue(randomTimeValue(), "test"); + submitRequest.setWaitForCompletion(waitForCompletion); + expectedParams.put("wait_for_completion", waitForCompletion.getStringRep()); + } + + Request request = AsyncSearchRequestConverters.submitAsyncSearch(submitRequest); + StringJoiner endpoint = new StringJoiner("/", "/", ""); + String index = String.join(",", indices); + if (Strings.hasLength(index)) { + endpoint.add(index); + } + endpoint.add("_async_search"); + assertEquals(HttpPost.METHOD_NAME, request.getMethod()); + assertEquals(endpoint.toString(), request.getEndpoint()); + assertEquals(expectedParams, request.getParameters()); + RequestConvertersTests.assertToXContentBody(searchSourceBuilder, request.getEntity()); + } + + private static void setRandomSearchParams(SubmitAsyncSearchRequest request, Map expectedParams) { + expectedParams.put(RestSearchAction.TYPED_KEYS_PARAM, "true"); + if (randomBoolean()) { + request.setRouting(randomAlphaOfLengthBetween(3, 10)); + expectedParams.put("routing", request.getRouting()); + } + if (randomBoolean()) { + request.setPreference(randomAlphaOfLengthBetween(3, 10)); + expectedParams.put("preference", request.getPreference()); + } + if (randomBoolean()) { + 
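+ // the search type is only sometimes randomized, but the expected parameter is recorded unconditionally below because the converter always sends it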
request.setSearchType(randomFrom(SearchType.CURRENTLY_SUPPORTED)); + } + expectedParams.put("search_type", request.getSearchType().name().toLowerCase(Locale.ROOT)); + if (randomBoolean()) { + request.setAllowPartialSearchResults(randomBoolean()); + expectedParams.put("allow_partial_search_results", Boolean.toString(request.getAllowPartialSearchResults())); + } + if (randomBoolean()) { + request.setRequestCache(randomBoolean()); + expectedParams.put("request_cache", Boolean.toString(request.getRequestCache())); + } + if (randomBoolean()) { + request.setBatchedReduceSize(randomIntBetween(2, Integer.MAX_VALUE)); + } + expectedParams.put("batched_reduce_size", Integer.toString(request.getBatchedReduceSize())); + if (randomBoolean()) { + request.setMaxConcurrentShardRequests(randomIntBetween(1, Integer.MAX_VALUE)); + } + expectedParams.put("max_concurrent_shard_requests", Integer.toString(request.getMaxConcurrentShardRequests())); + } + + public void testGetAsyncSearch() throws Exception { + String id = randomAlphaOfLengthBetween(5, 10); + Map expectedParams = new HashMap<>(); + GetAsyncSearchRequest getRequest = new GetAsyncSearchRequest(id); + if (randomBoolean()) { + TimeValue keepAlive = TimeValue.parseTimeValue(randomTimeValue(), "test"); + getRequest.setKeepAlive(keepAlive); + expectedParams.put("keep_alive", keepAlive.getStringRep()); + } + if (randomBoolean()) { + TimeValue waitForCompletion = TimeValue.parseTimeValue(randomTimeValue(), "test"); + getRequest.setWaitForCompletion(waitForCompletion); + expectedParams.put("wait_for_completion", waitForCompletion.getStringRep()); + } + + Request request = AsyncSearchRequestConverters.getAsyncSearch(getRequest); + String endpoint = "/_async_search/" + id; + assertEquals(HttpGet.METHOD_NAME, request.getMethod()); + assertEquals(endpoint, request.getEndpoint()); + assertEquals(expectedParams, request.getParameters()); + } + + public void testDeleteAsyncSearch() throws Exception { + String id = randomAlphaOfLengthBetween(5, 10); + DeleteAsyncSearchRequest deleteRequest = new DeleteAsyncSearchRequest(id); + + Request request = AsyncSearchRequestConverters.deleteAsyncSearch(deleteRequest); + assertEquals(HttpDelete.METHOD_NAME, request.getMethod()); + assertEquals("/_async_search/" + id, request.getEndpoint()); + assertTrue(request.getParameters().isEmpty()); + } } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/MLRequestConvertersTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/MLRequestConvertersTests.java index 6c280fba5ab01..7f8ed9e61bd1f 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/MLRequestConvertersTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/MLRequestConvertersTests.java @@ -91,6 +91,7 @@ import org.elasticsearch.client.ml.dataframe.DataFrameAnalyticsConfig; import org.elasticsearch.client.ml.dataframe.MlDataFrameAnalysisNamedXContentProvider; import org.elasticsearch.client.ml.dataframe.evaluation.MlEvaluationNamedXContentProvider; +import org.elasticsearch.client.ml.dataframe.stats.AnalysisStatsNamedXContentProvider; import org.elasticsearch.client.ml.filestructurefinder.FileStructure; import org.elasticsearch.client.ml.inference.MlInferenceNamedXContentProvider; import org.elasticsearch.client.ml.inference.TrainedModelConfig; @@ -1067,6 +1068,7 @@ protected NamedXContentRegistry xContentRegistry() { namedXContent.addAll(new MlDataFrameAnalysisNamedXContentProvider().getNamedXContentParsers());
namedXContent.addAll(new MlEvaluationNamedXContentProvider().getNamedXContentParsers()); namedXContent.addAll(new MlInferenceNamedXContentProvider().getNamedXContentParsers()); + namedXContent.addAll(new AnalysisStatsNamedXContentProvider().getNamedXContentParsers()); return new NamedXContentRegistry(namedXContent); } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/MachineLearningIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/MachineLearningIT.java index 17a9c0cd1cfca..a1204823d749b 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/MachineLearningIT.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/MachineLearningIT.java @@ -874,7 +874,7 @@ private String createExpiredData(String jobId) throws Exception { { // Index a randomly named unused state document String docId = "non_existing_job_" + randomFrom("model_state_1234567#1", "quantiles", "categorizer_state#1"); - IndexRequest indexRequest = new IndexRequest(".ml-state").id(docId); + IndexRequest indexRequest = new IndexRequest(".ml-state-000001").id(docId); indexRequest.source(Collections.emptyMap(), XContentType.JSON); indexRequest.setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE); highLevelClient().index(indexRequest, RequestOptions.DEFAULT); @@ -944,8 +944,8 @@ public void testDeleteExpiredData() throws Exception { assertTrue(forecastExists(jobId, forecastId)); { - // Verify .ml-state contains the expected unused state document - Iterable hits = searchAll(".ml-state"); + // Verify .ml-state* contains the expected unused state document + Iterable hits = searchAll(".ml-state*"); List target = new ArrayList<>(); hits.forEach(target::add); long numMatches = target.stream() @@ -974,8 +974,8 @@ public void testDeleteExpiredData() throws Exception { assertFalse(forecastExists(jobId, forecastId)); { - // Verify .ml-state doesn't contain unused state documents - Iterable hits = searchAll(".ml-state"); + // Verify .ml-state* doesn't contain unused state documents + Iterable hits = searchAll(".ml-state*"); List hitList = new ArrayList<>(); hits.forEach(hitList::add); long numMatches = hitList.stream() diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestConvertersTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestConvertersTests.java index cdabbb5b4c6cb..1360b58e0b1ec 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestConvertersTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestConvertersTests.java @@ -1021,12 +1021,34 @@ public void testSearchNullSource() throws IOException { public void testSearch() throws Exception { String searchEndpoint = randomFrom("_" + randomAlphaOfLength(5)); String[] indices = randomIndicesNames(0, 5); + Map expectedParams = new HashMap<>(); + SearchRequest searchRequest = createTestSearchRequest(indices, expectedParams); + + Request request = RequestConverters.search(searchRequest, searchEndpoint); + StringJoiner endpoint = new StringJoiner("/", "/", ""); + String index = String.join(",", indices); + if (Strings.hasLength(index)) { + endpoint.add(index); + } + endpoint.add(searchEndpoint); + assertEquals(HttpPost.METHOD_NAME, request.getMethod()); + assertEquals(endpoint.toString(), request.getEndpoint()); + assertEquals(expectedParams, request.getParameters()); + assertToXContentBody(searchRequest.source(), request.getEntity()); + } + + public static SearchRequest 
createTestSearchRequest(String[] indices, Map expectedParams) { SearchRequest searchRequest = new SearchRequest(indices); - Map expectedParams = new HashMap<>(); setRandomSearchParams(searchRequest, expectedParams); setRandomIndicesOptions(searchRequest::indicesOptions, searchRequest::indicesOptions, expectedParams); + SearchSourceBuilder searchSourceBuilder = createTestSearchSourceBuilder(); + searchRequest.source(searchSourceBuilder); + return searchRequest; + } + + public static SearchSourceBuilder createTestSearchSourceBuilder() { SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder(); // rarely skip setting the search source completely if (frequently()) { @@ -1071,22 +1093,11 @@ public void testSearch() throws Exception { searchSourceBuilder.collapse(new CollapseBuilder(randomAlphaOfLengthBetween(3, 10))); } } - searchRequest.source(searchSourceBuilder); } - - Request request = RequestConverters.search(searchRequest, searchEndpoint); - StringJoiner endpoint = new StringJoiner("/", "/", ""); - String index = String.join(",", indices); - if (Strings.hasLength(index)) { - endpoint.add(index); - } - endpoint.add(searchEndpoint); - assertEquals(HttpPost.METHOD_NAME, request.getMethod()); - assertEquals(endpoint.toString(), request.getEndpoint()); - assertEquals(expectedParams, request.getParameters()); - assertToXContentBody(searchSourceBuilder, request.getEntity()); + return searchSourceBuilder; } + public void testSearchNullIndicesAndTypes() { expectThrows(NullPointerException.class, () -> new SearchRequest((String[]) null)); expectThrows(NullPointerException.class, () -> new SearchRequest().indices((String[]) null)); @@ -1858,9 +1869,19 @@ private static void setRandomSearchParams(SearchRequest searchRequest, searchRequest.setCcsMinimizeRoundtrips(randomBoolean()); } expectedParams.put("ccs_minimize_roundtrips", Boolean.toString(searchRequest.isCcsMinimizeRoundtrips())); + if (randomBoolean()) { + searchRequest.setMaxConcurrentShardRequests(randomIntBetween(1, Integer.MAX_VALUE)); + } + expectedParams.put("max_concurrent_shard_requests", Integer.toString(searchRequest.getMaxConcurrentShardRequests())); + if (randomBoolean()) { + searchRequest.setPreFilterShardSize(randomIntBetween(2, Integer.MAX_VALUE)); + } + if (searchRequest.getPreFilterShardSize() != null) { + expectedParams.put("pre_filter_shard_size", Integer.toString(searchRequest.getPreFilterShardSize())); + } } - static void setRandomIndicesOptions(Consumer setter, Supplier getter, + public static void setRandomIndicesOptions(Consumer setter, Supplier getter, Map expectedParams) { if (randomBoolean()) { diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/RestHighLevelClientTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/RestHighLevelClientTests.java index a8e8037930741..e35adbc7aa5e4 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/RestHighLevelClientTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/RestHighLevelClientTests.java @@ -20,7 +20,6 @@ package org.elasticsearch.client; import com.fasterxml.jackson.core.JsonParseException; - import org.apache.http.HttpEntity; import org.apache.http.HttpHost; import org.apache.http.HttpResponse; @@ -69,6 +68,9 @@ import org.elasticsearch.client.ml.dataframe.evaluation.softclassification.ConfusionMatrixMetric; import org.elasticsearch.client.ml.dataframe.evaluation.softclassification.PrecisionMetric; import 
org.elasticsearch.client.ml.dataframe.evaluation.softclassification.RecallMetric; +import org.elasticsearch.client.ml.dataframe.stats.classification.ClassificationStats; +import org.elasticsearch.client.ml.dataframe.stats.outlierdetection.OutlierDetectionStats; +import org.elasticsearch.client.ml.dataframe.stats.regression.RegressionStats; import org.elasticsearch.client.ml.inference.preprocessing.CustomWordEmbedding; import org.elasticsearch.client.ml.inference.preprocessing.FrequencyEncoding; import org.elasticsearch.client.ml.inference.preprocessing.OneHotEncoding; @@ -697,7 +699,7 @@ public void testDefaultNamedXContents() { public void testProvidedNamedXContents() { List namedXContents = RestHighLevelClient.getProvidedNamedXContents(); - assertEquals(59, namedXContents.size()); + assertEquals(62, namedXContents.size()); Map<Class<?>, Integer> categories = new HashMap<>(); List names = new ArrayList<>(); for (NamedXContentRegistry.Entry namedXContent : namedXContents) { @@ -707,7 +709,7 @@ public void testProvidedNamedXContents() { categories.put(namedXContent.categoryClass, counter + 1); } } - assertEquals("Had: " + categories, 12, categories.size()); + assertEquals("Had: " + categories, 13, categories.size()); assertEquals(Integer.valueOf(3), categories.get(Aggregation.class)); assertTrue(names.contains(ChildrenAggregationBuilder.NAME)); assertTrue(names.contains(MatrixStatsAggregationBuilder.NAME)); @@ -737,6 +739,9 @@ public void testProvidedNamedXContents() { assertTrue(names.contains(OutlierDetection.NAME.getPreferredName())); assertTrue(names.contains(org.elasticsearch.client.ml.dataframe.Regression.NAME.getPreferredName())); assertTrue(names.contains(org.elasticsearch.client.ml.dataframe.Classification.NAME.getPreferredName())); + assertTrue(names.contains(OutlierDetectionStats.NAME.getPreferredName())); + assertTrue(names.contains(RegressionStats.NAME.getPreferredName())); + assertTrue(names.contains(ClassificationStats.NAME.getPreferredName())); assertEquals(Integer.valueOf(1), categories.get(SyncConfig.class)); assertTrue(names.contains(TimeSyncConfig.NAME)); assertEquals(Integer.valueOf(3), categories.get(org.elasticsearch.client.ml.dataframe.evaluation.Evaluation.class)); @@ -790,7 +795,13 @@ public void testApiNamingConventions() throws Exception { "indices.get_upgrade", "indices.put_alias", "render_search_template", - "scripts_painless_execute" + "scripts_painless_execute", + "cluster.put_component_template", + "cluster.get_component_template", + "cluster.delete_component_template", + "indices.create_data_stream", + "indices.get_data_streams", + "indices.delete_data_stream" }; // These APIs are not required for high-level client feature completeness String[] notRequiredApi = new String[] { @@ -887,6 +898,7 @@ apiName.startsWith("eql.") == false && apiName.endsWith("freeze") == false && apiName.endsWith("reload_analyzers") == false && + apiName.startsWith("async_search") == false && // IndicesClientIT.getIndexTemplate should be renamed "getTemplate" in version 8.0 when we // can get rid of 7.0's deprecated "getTemplate" apiName.equals("indices.get_index_template") == false) { diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/TransformIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/TransformIT.java index 3f93806aca779..94341c41685f0 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/TransformIT.java +++
b/client/rest-high-level/src/test/java/org/elasticsearch/client/TransformIT.java @@ -441,6 +441,8 @@ public void testGetStats() throws Exception { 0L, 0L, 0L, + 0L, + 0L, 0.0, 0.0, 0.0); diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/AsyncSearchIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/AsyncSearchIT.java new file mode 100644 index 0000000000000..38e7351e58836 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/AsyncSearchIT.java @@ -0,0 +1,70 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.client.asyncsearch; + +import org.elasticsearch.client.ESRestHighLevelClientTestCase; +import org.elasticsearch.client.RequestOptions; +import org.elasticsearch.client.core.AcknowledgedResponse; +import org.elasticsearch.common.settings.Settings; +import org.elasticsearch.index.query.QueryBuilders; +import org.elasticsearch.search.builder.SearchSourceBuilder; + +import java.io.IOException; + +public class AsyncSearchIT extends ESRestHighLevelClientTestCase { + + public void testAsyncSearch() throws IOException { + String index = "test-index"; + createIndex(index, Settings.EMPTY); + + SearchSourceBuilder sourceBuilder = new SearchSourceBuilder().query(QueryBuilders.matchAllQuery()); + SubmitAsyncSearchRequest submitRequest = new SubmitAsyncSearchRequest(sourceBuilder, index); + submitRequest.setCleanOnCompletion(false); + AsyncSearchResponse submitResponse = highLevelClient().asyncSearch().submit(submitRequest, RequestOptions.DEFAULT); + assertNotNull(submitResponse.getId()); + assertFalse(submitResponse.isPartial()); + assertTrue(submitResponse.getStartTime() > 0); + assertTrue(submitResponse.getExpirationTime() > 0); + assertNotNull(submitResponse.getSearchResponse()); + if (submitResponse.isRunning() == false) { + assertFalse(submitResponse.isPartial()); + } else { + assertTrue(submitResponse.isPartial()); + } + + GetAsyncSearchRequest getRequest = new GetAsyncSearchRequest(submitResponse.getId()); + AsyncSearchResponse getResponse = highLevelClient().asyncSearch().get(getRequest, RequestOptions.DEFAULT); + while (getResponse.isRunning()) { + getResponse = highLevelClient().asyncSearch().get(getRequest, RequestOptions.DEFAULT); + } + + assertFalse(getResponse.isRunning()); + assertFalse(getResponse.isPartial()); + assertTrue(getResponse.getStartTime() > 0); + assertTrue(getResponse.getExpirationTime() > 0); + assertNotNull(getResponse.getSearchResponse()); + + DeleteAsyncSearchRequest deleteRequest = new DeleteAsyncSearchRequest(submitResponse.getId()); + AcknowledgedResponse deleteAsyncSearchResponse = highLevelClient().asyncSearch().delete(deleteRequest, + RequestOptions.DEFAULT); + 
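// once the delete is acknowledged, the stored response should no longer be retrievable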
assertNotNull(deleteAsyncSearchResponse); + assertTrue(deleteAsyncSearchResponse.isAcknowledged()); + } } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/AsyncSearchResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/AsyncSearchResponseTests.java new file mode 100644 index 0000000000000..08c0da25e5bb6 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/AsyncSearchResponseTests.java @@ -0,0 +1,83 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.asyncsearch; + +import org.elasticsearch.ElasticsearchException; +import org.elasticsearch.action.search.SearchResponse; +import org.elasticsearch.action.search.SearchResponse.Clusters; +import org.elasticsearch.action.search.ShardSearchFailure; +import org.elasticsearch.client.AbstractResponseTestCase; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.common.xcontent.XContentType; +import org.elasticsearch.search.internal.InternalSearchResponse; + +import java.io.IOException; + +import static org.hamcrest.Matchers.containsString; + +public class AsyncSearchResponseTests + extends AbstractResponseTestCase<org.elasticsearch.xpack.core.search.action.AsyncSearchResponse, AsyncSearchResponse> { + + @Override + protected org.elasticsearch.xpack.core.search.action.AsyncSearchResponse createServerTestInstance(XContentType xContentType) { + boolean isPartial = randomBoolean(); + boolean isRunning = randomBoolean(); + long startTimeMillis = randomLongBetween(0, Long.MAX_VALUE); + long expirationTimeMillis = randomLongBetween(0, Long.MAX_VALUE); + String id = randomBoolean() ? null : randomAlphaOfLength(10); + ElasticsearchException error = randomBoolean() ? null : new ElasticsearchException(randomAlphaOfLength(10)); + // add search response, minimal object is okay since the full randomization of parsing is tested in SearchResponseTests + SearchResponse searchResponse = randomBoolean() ?
null + : new SearchResponse(InternalSearchResponse.empty(), randomAlphaOfLength(10), 1, 1, 0, randomIntBetween(0, 10000), + ShardSearchFailure.EMPTY_ARRAY, Clusters.EMPTY); + org.elasticsearch.xpack.core.search.action.AsyncSearchResponse testResponse = + new org.elasticsearch.xpack.core.search.action.AsyncSearchResponse(id, searchResponse, error, isPartial, isRunning, + startTimeMillis, expirationTimeMillis); + return testResponse; + } + + @Override + protected AsyncSearchResponse doParseToClientInstance(XContentParser parser) throws IOException { + return AsyncSearchResponse.fromXContent(parser); + } + + @Override + protected void assertInstances(org.elasticsearch.xpack.core.search.action.AsyncSearchResponse expected, AsyncSearchResponse parsed) { + assertNotSame(parsed, expected); + assertEquals(expected.getId(), parsed.getId()); + assertEquals(expected.isRunning(), parsed.isRunning()); + assertEquals(expected.isPartial(), parsed.isPartial()); + assertEquals(expected.getStartTime(), parsed.getStartTime()); + assertEquals(expected.getExpirationTime(), parsed.getExpirationTime()); + // we cannot directly compare error since Exceptions are wrapped differently on parsing, but we can check original message + if (expected.getFailure() != null) { + assertThat(parsed.getFailure().getMessage(), containsString(expected.getFailure().getMessage())); + } else { + assertNull(parsed.getFailure()); + } + // we don't need to check the complete parsed search response since this is done elsewhere + // only spot-check some randomized properties for equality here + if (expected.getSearchResponse() != null) { + assertEquals(expected.getSearchResponse().getTook(), parsed.getSearchResponse().getTook()); + assertEquals(expected.getSearchResponse().getScrollId(), parsed.getSearchResponse().getScrollId()); + } else { + assertNull(parsed.getSearchResponse()); + } + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequestTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequestTests.java new file mode 100644 index 0000000000000..b6861b218cd28 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/GetAsyncSearchRequestTests.java @@ -0,0 +1,41 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.elasticsearch.client.asyncsearch; + +import org.elasticsearch.client.ValidationException; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.test.ESTestCase; + +import java.util.concurrent.TimeUnit; + +public class GetAsyncSearchRequestTests extends ESTestCase { + + public void testValidation() { + GetAsyncSearchRequest getAsyncSearchRequest = new GetAsyncSearchRequest(randomAlphaOfLength(10)); + getAsyncSearchRequest.setKeepAlive(new TimeValue(0)); + assertTrue(getAsyncSearchRequest.validate().isPresent()); + ValidationException validationException = getAsyncSearchRequest.validate().get(); + assertEquals(1, validationException.validationErrors().size()); + assertEquals("Validation Failed: 1: keep_alive must be greater than 1 minute, got: 0s;", validationException.getMessage()); + + getAsyncSearchRequest.setKeepAlive(new TimeValue(1, TimeUnit.MINUTES)); + assertFalse(getAsyncSearchRequest.validate().isPresent()); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/SubmitAsyncSearchRequestTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/SubmitAsyncSearchRequestTests.java new file mode 100644 index 0000000000000..f7075052cab2b --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/asyncsearch/SubmitAsyncSearchRequestTests.java @@ -0,0 +1,57 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.elasticsearch.client.asyncsearch; + +import org.elasticsearch.client.ValidationException; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.search.builder.SearchSourceBuilder; +import org.elasticsearch.search.suggest.SuggestBuilder; +import org.elasticsearch.test.ESTestCase; + +import java.util.Optional; + +public class SubmitAsyncSearchRequestTests extends ESTestCase { + + public void testValidation() { + { + SearchSourceBuilder source = new SearchSourceBuilder(); + SubmitAsyncSearchRequest request = new SubmitAsyncSearchRequest(source, "test"); + Optional validation = request.validate(); + assertFalse(validation.isPresent()); + } + { + SearchSourceBuilder source = new SearchSourceBuilder().suggest(new SuggestBuilder()); + SubmitAsyncSearchRequest request = new SubmitAsyncSearchRequest(source, "test"); + Optional validation = request.validate(); + assertTrue(validation.isPresent()); + assertEquals(1, validation.get().validationErrors().size()); + assertEquals("suggest-only queries are not supported", validation.get().validationErrors().get(0)); + } + { + SubmitAsyncSearchRequest request = new SubmitAsyncSearchRequest(new SearchSourceBuilder(), "test"); + request.setKeepAlive(new TimeValue(1)); + Optional validation = request.validate(); + assertTrue(validation.isPresent()); + assertEquals(1, validation.get().validationErrors().size()); + assertEquals("[keep_alive] must be greater than 1 minute, got: 1ms", validation.get().validationErrors().get(0)); + } + } + +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStatsTests.java index 48ebf71e36023..d251f568dfa79 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStatsTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/DataFrameAnalyticsStatsTests.java @@ -20,6 +20,14 @@ package org.elasticsearch.client.ml.dataframe; import org.elasticsearch.client.ml.NodeAttributesTests; +import org.elasticsearch.client.ml.dataframe.stats.AnalysisStats; +import org.elasticsearch.client.ml.dataframe.stats.AnalysisStatsNamedXContentProvider; +import org.elasticsearch.client.ml.dataframe.stats.classification.ClassificationStatsTests; +import org.elasticsearch.client.ml.dataframe.stats.common.DataCountsTests; +import org.elasticsearch.client.ml.dataframe.stats.common.MemoryUsageTests; +import org.elasticsearch.client.ml.dataframe.stats.outlierdetection.OutlierDetectionStatsTests; +import org.elasticsearch.client.ml.dataframe.stats.regression.RegressionStatsTests; +import org.elasticsearch.common.xcontent.NamedXContentRegistry; import org.elasticsearch.common.xcontent.XContentBuilder; import org.elasticsearch.test.ESTestCase; @@ -31,23 +39,39 @@ public class DataFrameAnalyticsStatsTests extends ESTestCase { + @Override + protected NamedXContentRegistry xContentRegistry() { + List namedXContent = new ArrayList<>(); + namedXContent.addAll(new AnalysisStatsNamedXContentProvider().getNamedXContentParsers()); + return new NamedXContentRegistry(namedXContent); + } + public void testFromXContent() throws IOException { xContentTester(this::createParser, DataFrameAnalyticsStatsTests::randomDataFrameAnalyticsStats, DataFrameAnalyticsStatsTests::toXContent, DataFrameAnalyticsStats::fromXContent) .supportsUnknownFields(true) - .randomFieldsExcludeFilter(field -> 
field.startsWith("node.attributes")) + .randomFieldsExcludeFilter(field -> field.startsWith("node.attributes") || field.startsWith("analysis_stats")) .test(); } public static DataFrameAnalyticsStats randomDataFrameAnalyticsStats() { + AnalysisStats analysisStats = randomBoolean() ? null : + randomFrom( + ClassificationStatsTests.createRandom(), + OutlierDetectionStatsTests.createRandom(), + RegressionStatsTests.createRandom() + ); + return new DataFrameAnalyticsStats( randomAlphaOfLengthBetween(1, 10), randomFrom(DataFrameAnalyticsState.values()), randomBoolean() ? null : randomAlphaOfLength(10), randomBoolean() ? null : createRandomProgress(), + randomBoolean() ? null : DataCountsTests.createRandom(), randomBoolean() ? null : MemoryUsageTests.createRandom(), + analysisStats, randomBoolean() ? null : NodeAttributesTests.createRandom(), randomBoolean() ? null : randomAlphaOfLengthBetween(1, 20)); } @@ -71,9 +95,17 @@ public static void toXContent(DataFrameAnalyticsStats stats, XContentBuilder bui if (stats.getProgress() != null) { builder.field(DataFrameAnalyticsStats.PROGRESS.getPreferredName(), stats.getProgress()); } + if (stats.getDataCounts() != null) { + builder.field(DataFrameAnalyticsStats.DATA_COUNTS.getPreferredName(), stats.getDataCounts()); + } if (stats.getMemoryUsage() != null) { builder.field(DataFrameAnalyticsStats.MEMORY_USAGE.getPreferredName(), stats.getMemoryUsage()); } + if (stats.getAnalysisStats() != null) { + builder.startObject("analysis_stats"); + builder.field(stats.getAnalysisStats().getName(), stats.getAnalysisStats()); + builder.endObject(); + } if (stats.getNode() != null) { builder.field(DataFrameAnalyticsStats.NODE.getPreferredName(), stats.getNode()); } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/ClassificationStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/ClassificationStatsTests.java new file mode 100644 index 0000000000000..d23633c01d28a --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/ClassificationStatsTests.java @@ -0,0 +1,53 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.elasticsearch.client.ml.dataframe.stats.classification; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; +import java.time.Instant; + +public class ClassificationStatsTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected ClassificationStats doParseInstance(XContentParser parser) throws IOException { + return ClassificationStats.PARSER.apply(parser, null); + } + + @Override + protected ClassificationStats createTestInstance() { + return createRandom(); + } + + public static ClassificationStats createRandom() { + return new ClassificationStats( + Instant.now(), + randomBoolean() ? null : randomIntBetween(1, Integer.MAX_VALUE), + HyperparametersTests.createRandom(), + TimingStatsTests.createRandom(), + ValidationLossTests.createRandom() + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/HyperparametersTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/HyperparametersTests.java new file mode 100644 index 0000000000000..aa1ab12c542ea --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/HyperparametersTests.java @@ -0,0 +1,62 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.classification; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class HyperparametersTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected Hyperparameters doParseInstance(XContentParser parser) throws IOException { + return Hyperparameters.PARSER.apply(parser, null); + } + + @Override + protected Hyperparameters createTestInstance() { + return createRandom(); + } + + public static Hyperparameters createRandom() { + return new Hyperparameters( + randomBoolean() ? null : randomAlphaOfLength(10), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? 
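// every hyperparameter is optional on the wire, so each field is randomly left null to exercise parsing of absent values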
null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble() + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/TimingStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/TimingStatsTests.java new file mode 100644 index 0000000000000..5e2c4c842e18d --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/TimingStatsTests.java @@ -0,0 +1,50 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.classification; + +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class TimingStatsTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected TimingStats doParseInstance(XContentParser parser) throws IOException { + return TimingStats.PARSER.apply(parser, null); + } + + @Override + protected TimingStats createTestInstance() { + return createRandom(); + } + + public static TimingStats createRandom() { + return new TimingStats( + randomBoolean() ? null : TimeValue.timeValueMillis(randomNonNegativeLong()), + randomBoolean() ? null : TimeValue.timeValueMillis(randomNonNegativeLong()) + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/ValidationLossTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/ValidationLossTests.java new file mode 100644 index 0000000000000..c841af43d4393 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/classification/ValidationLossTests.java @@ -0,0 +1,50 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. 
See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.classification; + +import org.elasticsearch.client.ml.dataframe.stats.common.FoldValuesTests; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class ValidationLossTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected ValidationLoss doParseInstance(XContentParser parser) throws IOException { + return ValidationLoss.PARSER.apply(parser, null); + } + + @Override + protected ValidationLoss createTestInstance() { + return createRandom(); + } + + public static ValidationLoss createRandom() { + return new ValidationLoss( + randomBoolean() ? null : randomAlphaOfLength(10), + randomBoolean() ? null : randomList(5, () -> FoldValuesTests.createRandom()) + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/DataCountsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/DataCountsTests.java new file mode 100644 index 0000000000000..5e877e2d40f7b --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/DataCountsTests.java @@ -0,0 +1,51 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.client.ml.dataframe.stats.common; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class DataCountsTests extends AbstractXContentTestCase { + + @Override + protected DataCounts createTestInstance() { + return createRandom(); + } + + public static DataCounts createRandom() { + return new DataCounts( + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong() + ); + } + + @Override + protected DataCounts doParseInstance(XContentParser parser) throws IOException { + return DataCounts.PARSER.apply(parser, null); + } + + @Override + protected boolean supportsUnknownFields() { + return true; + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/FoldValuesTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/FoldValuesTests.java new file mode 100644 index 0000000000000..90d9219327648 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/FoldValuesTests.java @@ -0,0 +1,51 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. 
See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.common; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class FoldValuesTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected FoldValues doParseInstance(XContentParser parser) throws IOException { + return FoldValues.PARSER.apply(parser, null); + } + + @Override + protected FoldValues createTestInstance() { + return createRandom(); + } + + public static FoldValues createRandom() { + int valuesSize = randomIntBetween(0, 10); + double[] values = new double[valuesSize]; + for (int i = 0; i < valuesSize; i++) { + values[i] = randomDouble(); + } + return new FoldValues(randomIntBetween(0, Integer.MAX_VALUE), values); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/MemoryUsageTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/MemoryUsageTests.java similarity index 96% rename from client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/MemoryUsageTests.java rename to client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/MemoryUsageTests.java index 8e06db6f2b37f..0e27295752190 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/MemoryUsageTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/common/MemoryUsageTests.java @@ -16,7 +16,7 @@ * specific language governing permissions and limitations * under the License. */ -package org.elasticsearch.client.ml.dataframe; +package org.elasticsearch.client.ml.dataframe.stats.common; import org.elasticsearch.common.xcontent.XContentParser; import org.elasticsearch.test.AbstractXContentTestCase; diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/OutlierDetectionStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/OutlierDetectionStatsTests.java new file mode 100644 index 0000000000000..f40de67a62cd2 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/OutlierDetectionStatsTests.java @@ -0,0 +1,51 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.outlierdetection; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; +import java.time.Instant; + +public class OutlierDetectionStatsTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected OutlierDetectionStats doParseInstance(XContentParser parser) throws IOException { + return OutlierDetectionStats.PARSER.apply(parser, null); + } + + @Override + protected OutlierDetectionStats createTestInstance() { + return createRandom(); + } + + public static OutlierDetectionStats createRandom() { + return new OutlierDetectionStats( + Instant.now(), + ParametersTests.createRandom(), + TimingStatsTests.createRandom() + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/ParametersTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/ParametersTests.java new file mode 100644 index 0000000000000..4f566562683de --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/ParametersTests.java @@ -0,0 +1,53 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.outlierdetection; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class ParametersTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected Parameters doParseInstance(XContentParser parser) throws IOException { + return Parameters.PARSER.apply(parser, null); + } + + @Override + protected Parameters createTestInstance() { + return createRandom(); + } + + public static Parameters createRandom() { + return new Parameters( + randomBoolean() ? null : randomIntBetween(1, Integer.MAX_VALUE), + randomBoolean() ? null : randomAlphaOfLength(5), + randomBoolean() ? null : randomBoolean(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? 
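// only the parameters the server actually resolved are echoed back, so any of these may be absent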
null : randomBoolean() + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/TimingStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/TimingStatsTests.java new file mode 100644 index 0000000000000..5483782e1d1cf --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/outlierdetection/TimingStatsTests.java @@ -0,0 +1,48 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.outlierdetection; + +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class TimingStatsTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + + @Override + protected TimingStats doParseInstance(XContentParser parser) throws IOException { + return TimingStats.PARSER.apply(parser, null); + } + + @Override + protected TimingStats createTestInstance() { + return createRandom(); + } + + public static TimingStats createRandom() { + return new TimingStats(randomBoolean() ? null : TimeValue.timeValueMillis(randomNonNegativeLong())); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/HyperparametersTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/HyperparametersTests.java new file mode 100644 index 0000000000000..43d0571bb206f --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/HyperparametersTests.java @@ -0,0 +1,62 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class HyperparametersTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected Hyperparameters doParseInstance(XContentParser parser) throws IOException { + return Hyperparameters.PARSER.apply(parser, null); + } + + + @Override + protected Hyperparameters createTestInstance() { + return createRandom(); + } + + public static Hyperparameters createRandom() { + return new Hyperparameters( + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomIntBetween(0, Integer.MAX_VALUE), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble(), + randomBoolean() ? null : randomDouble() + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/RegressionStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/RegressionStatsTests.java new file mode 100644 index 0000000000000..d4e784bb335cc --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/RegressionStatsTests.java @@ -0,0 +1,54 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; +import java.time.Instant; + +public class RegressionStatsTests extends AbstractXContentTestCase { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected RegressionStats doParseInstance(XContentParser parser) throws IOException { + return RegressionStats.PARSER.apply(parser, null); + } + + + @Override + protected RegressionStats createTestInstance() { + return createRandom(); + } + + public static RegressionStats createRandom() { + return new RegressionStats( + Instant.now(), + randomBoolean() ? 
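// the iteration count is likewise nullable and therefore randomly omitted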
null : randomIntBetween(1, Integer.MAX_VALUE), + HyperparametersTests.createRandom(), + TimingStatsTests.createRandom(), + ValidationLossTests.createRandom() + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/TimingStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/TimingStatsTests.java new file mode 100644 index 0000000000000..95fe6531f3b83 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/TimingStatsTests.java @@ -0,0 +1,50 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class TimingStatsTests extends AbstractXContentTestCase<TimingStats> { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected TimingStats doParseInstance(XContentParser parser) throws IOException { + return TimingStats.PARSER.apply(parser, null); + } + + @Override + protected TimingStats createTestInstance() { + return createRandom(); + } + + public static TimingStats createRandom() { + return new TimingStats( + randomBoolean() ? null : TimeValue.timeValueMillis(randomNonNegativeLong()), + randomBoolean() ? null : TimeValue.timeValueMillis(randomNonNegativeLong()) + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/ValidationLossTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/ValidationLossTests.java new file mode 100644 index 0000000000000..d2a9f960bbbb2 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/ml/dataframe/stats/regression/ValidationLossTests.java @@ -0,0 +1,50 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied.
See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.client.ml.dataframe.stats.regression; + +import org.elasticsearch.client.ml.dataframe.stats.common.FoldValuesTests; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.test.AbstractXContentTestCase; + +import java.io.IOException; + +public class ValidationLossTests extends AbstractXContentTestCase<ValidationLoss> { + + @Override + protected boolean supportsUnknownFields() { + return true; + } + + @Override + protected ValidationLoss doParseInstance(XContentParser parser) throws IOException { + return ValidationLoss.PARSER.apply(parser, null); + } + + @Override + protected ValidationLoss createTestInstance() { + return createRandom(); + } + + public static ValidationLoss createRandom() { + return new ValidationLoss( + randomBoolean() ? null : randomAlphaOfLength(10), + randomBoolean() ? null : randomList(5, () -> FoldValuesTests.createRandom()) + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/rollup/GetRollupJobResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/rollup/GetRollupJobResponseTests.java index b866420a44c01..122f156986d71 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/rollup/GetRollupJobResponseTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/rollup/GetRollupJobResponseTests.java @@ -64,8 +64,9 @@ private GetRollupJobResponse createTestInstance() { private RollupIndexerJobStats randomStats() { return new RollupIndexerJobStats(randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), - randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), - randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong()); + randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), + randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), + randomNonNegativeLong()); } private RollupJobStatus randomStatus() { @@ -120,6 +121,8 @@ public void toXContent(RollupIndexerJobStats stats, XContentBuilder builder, ToX builder.field(IndexerJobStats.SEARCH_TIME_IN_MS.getPreferredName(), stats.getSearchTime()); builder.field(IndexerJobStats.SEARCH_TOTAL.getPreferredName(), stats.getSearchTotal()); builder.field(IndexerJobStats.SEARCH_FAILURES.getPreferredName(), stats.getSearchFailures()); + builder.field(IndexerJobStats.PROCESSING_TIME_IN_MS.getPreferredName(), stats.getProcessingTime()); + builder.field(IndexerJobStats.PROCESSING_TOTAL.getPreferredName(), stats.getProcessingTotal()); builder.endObject(); } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/DeleteRoleMappingResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/DeleteRoleMappingResponseTests.java index d89deb44e9f68..1eee41bc23685 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/DeleteRoleMappingResponseTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/DeleteRoleMappingResponseTests.java @@ -35,15 +35,7 @@ public class DeleteRoleMappingResponseTests extends ESTestCase { public void testFromXContent() throws IOException { final String json = "{ \"found\" : \"true\" }"; final DeleteRoleMappingResponse response =
DeleteRoleMappingResponse.fromXContent(XContentType.JSON.xContent().createParser( - new NamedXContentRegistry(Collections.emptyList()), new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String modernName) { - } - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - } - }, json)); + new NamedXContentRegistry(Collections.emptyList()), DeprecationHandler.IGNORE_DEPRECATIONS, json)); final DeleteRoleMappingResponse expectedResponse = new DeleteRoleMappingResponse(true); assertThat(response, equalTo(expectedResponse)); } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/ExpressionRoleMappingTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/ExpressionRoleMappingTests.java index f30307ebde51a..3dd9d3ca5ee53 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/ExpressionRoleMappingTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/ExpressionRoleMappingTests.java @@ -37,29 +37,21 @@ public class ExpressionRoleMappingTests extends ESTestCase { public void testExpressionRoleMappingParser() throws IOException { - final String json = - "{\n" + - " \"enabled\" : true,\n" + - " \"roles\" : [\n" + - " \"superuser\"\n" + - " ],\n" + - " \"rules\" : {\n" + - " \"field\" : {\n" + - " \"realm.name\" : \"kerb1\"\n" + - " }\n" + - " },\n" + - " \"metadata\" : { }\n" + + final String json = + "{\n" + + " \"enabled\" : true,\n" + + " \"roles\" : [\n" + + " \"superuser\"\n" + + " ],\n" + + " \"rules\" : {\n" + + " \"field\" : {\n" + + " \"realm.name\" : \"kerb1\"\n" + + " }\n" + + " },\n" + + " \"metadata\" : { }\n" + " }"; final ExpressionRoleMapping expressionRoleMapping = ExpressionRoleMapping.PARSER.parse(XContentType.JSON.xContent().createParser( - new NamedXContentRegistry(Collections.emptyList()), new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String modernName) { - } - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - } - }, json), "example-role-mapping"); + new NamedXContentRegistry(Collections.emptyList()), DeprecationHandler.IGNORE_DEPRECATIONS, json), "example-role-mapping"); final ExpressionRoleMapping expectedRoleMapping = new ExpressionRoleMapping("example-role-mapping", FieldRoleMapperExpression.ofKeyValues("realm.name", "kerb1"), singletonList("superuser"), Collections.emptyList(), diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetPrivilegesResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetPrivilegesResponseTests.java index bf55e224095fe..ce17c9b1105d8 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetPrivilegesResponseTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetPrivilegesResponseTests.java @@ -81,15 +81,7 @@ public void testFromXContent() throws IOException { "}"; final GetPrivilegesResponse response = GetPrivilegesResponse.fromXContent(XContentType.JSON.xContent().createParser( - new NamedXContentRegistry(Collections.emptyList()), new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String modernName) { - } - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - } - }, json)); + new NamedXContentRegistry(Collections.emptyList()), DeprecationHandler.IGNORE_DEPRECATIONS, json)); 
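// For reference, the parser-construction idiom these security tests now share,
// restated as a compact sketch (SomeResponse is a hypothetical stand-in for any
// of the fromXContent-style response classes touched in this change):
//
//     XContentParser parser = XContentType.JSON.xContent().createParser(
//         new NamedXContentRegistry(Collections.emptyList()),
//         DeprecationHandler.IGNORE_DEPRECATIONS,   // built-in no-op handler replacing the anonymous DeprecationHandler
//         json);
//     SomeResponse response = SomeResponse.fromXContent(parser);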
final ApplicationPrivilege readTestappPrivilege = new ApplicationPrivilege("testapp", "read", Arrays.asList("action:login", "data:read/*"), null); diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRoleMappingsResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRoleMappingsResponseTests.java index 20883b859f9ae..4fda57f2ff631 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRoleMappingsResponseTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRoleMappingsResponseTests.java @@ -36,42 +36,34 @@ public class GetRoleMappingsResponseTests extends ESTestCase { public void testFromXContent() throws IOException { - final String json = "{\n" + - " \"kerberosmapping\" : {\n" + - " \"enabled\" : true,\n" + - " \"roles\" : [\n" + - " \"superuser\"\n" + - " ],\n" + - " \"rules\" : {\n" + - " \"field\" : {\n" + - " \"realm.name\" : \"kerb1\"\n" + - " }\n" + - " },\n" + - " \"metadata\" : { }\n" + - " },\n" + - " \"ldapmapping\" : {\n" + - " \"enabled\" : false,\n" + - " \"roles\" : [\n" + - " \"monitoring\"\n" + - " ],\n" + - " \"rules\" : {\n" + - " \"field\" : {\n" + - " \"groups\" : \"cn=ipausers,cn=groups,cn=accounts,dc=ipademo,dc=local\"\n" + - " }\n" + - " },\n" + - " \"metadata\" : { }\n" + - " }\n" + + final String json = "{\n" + + " \"kerberosmapping\" : {\n" + + " \"enabled\" : true,\n" + + " \"roles\" : [\n" + + " \"superuser\"\n" + + " ],\n" + + " \"rules\" : {\n" + + " \"field\" : {\n" + + " \"realm.name\" : \"kerb1\"\n" + + " }\n" + + " },\n" + + " \"metadata\" : { }\n" + + " },\n" + + " \"ldapmapping\" : {\n" + + " \"enabled\" : false,\n" + + " \"roles\" : [\n" + + " \"monitoring\"\n" + + " ],\n" + + " \"rules\" : {\n" + + " \"field\" : {\n" + + " \"groups\" : \"cn=ipausers,cn=groups,cn=accounts,dc=ipademo,dc=local\"\n" + + " }\n" + + " },\n" + + " \"metadata\" : { }\n" + + " }\n" + "}"; final GetRoleMappingsResponse response = GetRoleMappingsResponse.fromXContent(XContentType.JSON.xContent().createParser( - new NamedXContentRegistry(Collections.emptyList()), new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String modernName) { - } - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - } - }, json)); + new NamedXContentRegistry(Collections.emptyList()), DeprecationHandler.IGNORE_DEPRECATIONS, json)); final List<ExpressionRoleMapping> expectedRoleMappingsList = new ArrayList<>(); expectedRoleMappingsList.add(new ExpressionRoleMapping("kerberosmapping", FieldRoleMapperExpression.ofKeyValues("realm.name", "kerb1"), Collections.singletonList("superuser"), Collections.emptyList(), null, true)); diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRolesResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRolesResponseTests.java index c4620fa1a2f3d..8be4320e4de7c 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRolesResponseTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/GetRolesResponseTests.java @@ -64,15 +64,7 @@ public void testFromXContent() throws IOException { " }\n" + "}"; final GetRolesResponse response = GetRolesResponse.fromXContent((XContentType.JSON.xContent().createParser( - new NamedXContentRegistry(Collections.emptyList()), new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String
modernName) { - } - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - } - }, json))); + new NamedXContentRegistry(Collections.emptyList()), DeprecationHandler.IGNORE_DEPRECATIONS, json))); assertThat(response.getRoles().size(), equalTo(1)); assertThat(response.getTransientMetadataMap().size(), equalTo(1)); final Role role = response.getRoles().get(0); diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/support/expressiondsl/parser/RoleMapperExpressionParserTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/support/expressiondsl/parser/RoleMapperExpressionParserTests.java index 24ed5684fa856..1c9ccaa2f7490 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/support/expressiondsl/parser/RoleMapperExpressionParserTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/support/expressiondsl/parser/RoleMapperExpressionParserTests.java @@ -115,15 +115,7 @@ private <T> T checkExpressionType(RoleMapperExpression expr, Class<T> type) { private RoleMapperExpression parse(String json) throws IOException { return new RoleMapperExpressionParser().parse("rules", XContentType.JSON.xContent().createParser(new NamedXContentRegistry( - Collections.emptyList()), new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String modernName) { - } - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - } - }, json)); + Collections.emptyList()), DeprecationHandler.IGNORE_DEPRECATIONS, json)); } } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/user/privileges/ApplicationPrivilegeTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/user/privileges/ApplicationPrivilegeTests.java index b720187673023..532ff612fedc1 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/security/user/privileges/ApplicationPrivilegeTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/security/user/privileges/ApplicationPrivilegeTests.java @@ -54,15 +54,7 @@ public void testFromXContentAndToXContent() throws IOException { + " }\n" + "}"; final ApplicationPrivilege privilege = ApplicationPrivilege.fromXContent(XContentType.JSON.xContent().createParser( - new NamedXContentRegistry(Collections.emptyList()), new DeprecationHandler() { - @Override - public void usedDeprecatedName(String usedName, String modernName) { - } - - @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - } - }, json)); + new NamedXContentRegistry(Collections.emptyList()), DeprecationHandler.IGNORE_DEPRECATIONS, json)); final Map<String, Object> metadata = new HashMap<>(); metadata.put("description", "Read access to myapp"); final ApplicationPrivilege expectedPrivilege = diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/PreviewTransformResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/PreviewTransformResponseTests.java index 8e1dbefa127a8..fcbc746de2263 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/PreviewTransformResponseTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/PreviewTransformResponseTests.java @@ -19,47 +19,75 @@ package org.elasticsearch.client.transform; +import org.elasticsearch.action.admin.indices.alias.Alias; +import
org.elasticsearch.client.indices.CreateIndexRequest; +import org.elasticsearch.common.bytes.BytesReference; +import org.elasticsearch.common.settings.Settings; +import org.elasticsearch.common.xcontent.ToXContent; import org.elasticsearch.common.xcontent.XContentBuilder; +import org.elasticsearch.common.xcontent.XContentFactory; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.common.xcontent.XContentType; import org.elasticsearch.test.ESTestCase; import java.io.IOException; import java.util.ArrayList; import java.util.HashMap; +import java.util.HashSet; import java.util.List; import java.util.Map; +import java.util.Set; import static org.elasticsearch.test.AbstractXContentTestCase.xContentTester; +import static org.hamcrest.Matchers.equalTo; public class PreviewTransformResponseTests extends ESTestCase { public void testFromXContent() throws IOException { - xContentTester(this::createParser, - this::createTestInstance, - this::toXContent, - PreviewTransformResponse::fromXContent) - .supportsUnknownFields(true) - .randomFieldsExcludeFilter(path -> path.isEmpty() == false) - .test(); + xContentTester(this::createParser, this::createTestInstance, this::toXContent, PreviewTransformResponse::fromXContent) + .supportsUnknownFields(true) + .randomFieldsExcludeFilter(path -> path.isEmpty() == false) + .test(); } - private PreviewTransformResponse createTestInstance() { - int numDocs = randomIntBetween(5, 10); - List<Map<String, Object>> docs = new ArrayList<>(numDocs); - for (int i=0; i<numDocs; i++) { - Map<String, Object> doc = new HashMap<>(); - for (int j=0; j<10; j++) { - doc.put(randomAlphaOfLength(10), randomAlphaOfLength(10)); - } - docs.add(doc); - } - int numMappingEntries = randomIntBetween(5, 10); - Map<String, Object> mappings = new HashMap<>(numMappingEntries); - for (int i = 0; i < numMappingEntries; i++) { - mappings.put(randomAlphaOfLength(10), Map.of("type", randomAlphaOfLength(10))); + public void testCreateIndexRequest() throws IOException { + PreviewTransformResponse previewResponse = randomPreviewResponse(); + + CreateIndexRequest createIndexRequest = previewResponse.getCreateIndexRequest("dest_index"); + assertEquals("dest_index", createIndexRequest.index()); + assertThat(createIndexRequest.aliases(), equalTo(previewResponse.getAliases())); + assertThat(createIndexRequest.settings(), equalTo(previewResponse.getSettings())); + + XContentBuilder builder = XContentFactory.contentBuilder(XContentType.JSON); + builder.map(previewResponse.getMappings()); + + assertThat(BytesReference.bytes(builder), equalTo(createIndexRequest.mappings())); + } + + public void testBWCPre77XContent() throws IOException { + PreviewTransformResponse response = randomPreviewResponse(); + + XContentBuilder builder = XContentFactory.jsonBuilder(); + + builder.startObject(); + builder.startArray("preview"); + for (Map<String, Object> doc : response.getDocs()) { + builder.map(doc); } + builder.endArray(); + builder.field("mappings", response.getGeneratedDestIndexSettings().getMappings()); + builder.endObject(); + XContentParser parser = createParser(builder); + PreviewTransformResponse oldResponse = PreviewTransformResponse.fromXContent(parser); - return new PreviewTransformResponse(docs, mappings); + assertThat(response.getDocs(), equalTo(oldResponse.getDocs())); + assertThat(response.getMappings(), equalTo(oldResponse.getMappings())); + assertTrue(oldResponse.getAliases().isEmpty()); + assertThat(oldResponse.getSettings(), equalTo(Settings.EMPTY)); + } + + private PreviewTransformResponse createTestInstance() { + return randomPreviewResponse(); } private void toXContent(PreviewTransformResponse response, XContentBuilder builder) throws IOException { @@ -69,7 +97,63 @@ private void
toXContent(PreviewTransformResponse response, XContentBuilder build builder.map(doc); } builder.endArray(); - builder.field("mappings", response.getMappings()); + builder.startObject("generated_dest_index"); + builder.field("mappings", response.getGeneratedDestIndexSettings().getMappings()); + + builder.startObject("settings"); + response.getGeneratedDestIndexSettings().getSettings().toXContent(builder, ToXContent.EMPTY_PARAMS); + builder.endObject(); + + builder.startObject("aliases"); + for (Alias alias : response.getGeneratedDestIndexSettings().getAliases()) { + alias.toXContent(builder, ToXContent.EMPTY_PARAMS); + } + builder.endObject(); builder.endObject(); + builder.endObject(); + } + + private static PreviewTransformResponse randomPreviewResponse() { + int size = randomIntBetween(0, 10); + List<Map<String, Object>> data = new ArrayList<>(size); + for (int i = 0; i < size; i++) { + data.add(Map.of(randomAlphaOfLength(10), Map.of("value1", randomIntBetween(1, 100)))); + } + + return new PreviewTransformResponse(data, randomGeneratedDestIndexSettings()); + } + + private static PreviewTransformResponse.GeneratedDestIndexSettings randomGeneratedDestIndexSettings() { + int size = randomIntBetween(0, 10); + + Map<String, Object> mappings = null; + if (randomBoolean()) { + mappings = new HashMap<>(size); + + for (int i = 0; i < size; i++) { + mappings.put(randomAlphaOfLength(10), Map.of("type", randomAlphaOfLength(10))); + } + } + + Settings settings = null; + if (randomBoolean()) { + Settings.Builder settingsBuilder = Settings.builder(); + size = randomIntBetween(0, 10); + for (int i = 0; i < size; i++) { + settingsBuilder.put(randomAlphaOfLength(10), randomBoolean()); + } + settings = settingsBuilder.build(); + } + + Set<Alias> aliases = null; + if (randomBoolean()) { + aliases = new HashSet<>(); + size = randomIntBetween(0, 10); + for (int i = 0; i < size; i++) { + aliases.add(new Alias(randomAlphaOfLength(10))); + } + } + + return new PreviewTransformResponse.GeneratedDestIndexSettings(mappings, settings, aliases); } } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/hlrc/PreviewTransformResponseTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/hlrc/PreviewTransformResponseTests.java new file mode 100644 index 0000000000000..1aed99f320247 --- /dev/null +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/hlrc/PreviewTransformResponseTests.java @@ -0,0 +1,118 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License.
+ */ + +package org.elasticsearch.client.transform.hlrc; + +import org.elasticsearch.action.admin.indices.alias.Alias; +import org.elasticsearch.client.AbstractResponseTestCase; +import org.elasticsearch.client.transform.PreviewTransformResponse; +import org.elasticsearch.common.settings.Settings; +import org.elasticsearch.common.xcontent.XContentParser; +import org.elasticsearch.common.xcontent.XContentType; +import org.elasticsearch.xpack.core.transform.action.PreviewTransformAction; +import org.elasticsearch.xpack.core.transform.action.PreviewTransformAction.Response; +import org.elasticsearch.xpack.core.transform.transforms.TransformDestIndexSettings; + +import java.io.IOException; +import java.util.ArrayList; +import java.util.HashMap; +import java.util.HashSet; +import java.util.List; +import java.util.Map; +import java.util.Set; + +import static org.hamcrest.Matchers.equalTo; + +public class PreviewTransformResponseTests extends AbstractResponseTestCase< + PreviewTransformAction.Response, + org.elasticsearch.client.transform.PreviewTransformResponse> { + + public static Response randomPreviewResponse() { + int size = randomIntBetween(0, 10); + List<Map<String, Object>> data = new ArrayList<>(size); + for (int i = 0; i < size; i++) { + data.add(Map.of(randomAlphaOfLength(10), Map.of("value1", randomIntBetween(1, 100)))); + } + + return new Response(data, randomGeneratedDestIndexSettings()); + } + + private static TransformDestIndexSettings randomGeneratedDestIndexSettings() { + int size = randomIntBetween(0, 10); + + Map<String, Object> mappings = null; + + if (randomBoolean()) { + mappings = new HashMap<>(size); + + for (int i = 0; i < size; i++) { + mappings.put(randomAlphaOfLength(10), Map.of("type", randomAlphaOfLength(10))); + } + } + + Settings settings = null; + if (randomBoolean()) { + Settings.Builder settingsBuilder = Settings.builder(); + size = randomIntBetween(0, 10); + for (int i = 0; i < size; i++) { + settingsBuilder.put(randomAlphaOfLength(10), randomBoolean()); + } + settings = settingsBuilder.build(); + } + + Set<Alias> aliases = null; + + if (randomBoolean()) { + aliases = new HashSet<>(); + size = randomIntBetween(0, 10); + for (int i = 0; i < size; i++) { + aliases.add(new Alias(randomAlphaOfLength(10))); + } + } + + return new TransformDestIndexSettings(mappings, settings, aliases); + } + + @Override + protected Response createServerTestInstance(XContentType xContentType) { + return randomPreviewResponse(); + } + + @Override + protected PreviewTransformResponse doParseToClientInstance(XContentParser parser) throws IOException { + return org.elasticsearch.client.transform.PreviewTransformResponse.fromXContent(parser); + } + + @Override + protected void assertInstances(Response serverTestInstance, PreviewTransformResponse clientInstance) { + assertThat(serverTestInstance.getDocs(), equalTo(clientInstance.getDocs())); + assertThat( + serverTestInstance.getGeneratedDestIndexSettings().getAliases(), + equalTo(clientInstance.getGeneratedDestIndexSettings().getAliases()) + ); + assertThat( + serverTestInstance.getGeneratedDestIndexSettings().getMappings(), + equalTo(clientInstance.getGeneratedDestIndexSettings().getMappings()) + ); + assertThat( + serverTestInstance.getGeneratedDestIndexSettings().getSettings(), + equalTo(clientInstance.getGeneratedDestIndexSettings().getSettings()) + ); + } +} diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/TransformIndexerStatsTests.java
b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/TransformIndexerStatsTests.java index 018cab89b0fc9..e06a7cddb93e9 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/TransformIndexerStatsTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/TransformIndexerStatsTests.java @@ -31,41 +31,103 @@ public class TransformIndexerStatsTests extends ESTestCase { public void testFromXContent() throws IOException { xContentTester( - this::createParser, - TransformIndexerStatsTests::randomStats, - TransformIndexerStatsTests::toXContent, - TransformIndexerStats::fromXContent) - .supportsUnknownFields(true) - .test(); + this::createParser, + TransformIndexerStatsTests::randomStats, + TransformIndexerStatsTests::toXContent, + TransformIndexerStats::fromXContent + ).supportsUnknownFields(true).test(); } public static TransformIndexerStats randomStats() { - return new TransformIndexerStats(randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), - randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), - randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), randomNonNegativeLong(), - randomBoolean() ? null : randomDouble(), - randomBoolean() ? null : randomDouble(), - randomBoolean() ? null : randomDouble()); + return new TransformIndexerStats( + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomNonNegativeLong(), + randomDouble(), + randomDouble(), + randomDouble() + ); } public static void toXContent(TransformIndexerStats stats, XContentBuilder builder) throws IOException { builder.startObject(); - builder.field(IndexerJobStats.NUM_PAGES.getPreferredName(), stats.getNumPages()); - builder.field(IndexerJobStats.NUM_INPUT_DOCUMENTS.getPreferredName(), stats.getNumDocuments()); - builder.field(IndexerJobStats.NUM_OUTPUT_DOCUMENTS.getPreferredName(), stats.getOutputDocuments()); - builder.field(IndexerJobStats.NUM_INVOCATIONS.getPreferredName(), stats.getNumInvocations()); - builder.field(IndexerJobStats.INDEX_TIME_IN_MS.getPreferredName(), stats.getIndexTime()); - builder.field(IndexerJobStats.INDEX_TOTAL.getPreferredName(), stats.getIndexTotal()); - builder.field(IndexerJobStats.INDEX_FAILURES.getPreferredName(), stats.getIndexFailures()); - builder.field(IndexerJobStats.SEARCH_TIME_IN_MS.getPreferredName(), stats.getSearchTime()); - builder.field(IndexerJobStats.SEARCH_TOTAL.getPreferredName(), stats.getSearchTotal()); - builder.field(IndexerJobStats.SEARCH_FAILURES.getPreferredName(), stats.getSearchFailures()); - builder.field(TransformIndexerStats.EXPONENTIAL_AVG_CHECKPOINT_DURATION_MS.getPreferredName(), - stats.getExpAvgCheckpointDurationMs()); - builder.field(TransformIndexerStats.EXPONENTIAL_AVG_DOCUMENTS_INDEXED.getPreferredName(), - stats.getExpAvgDocumentsIndexed()); - builder.field(TransformIndexerStats.EXPONENTIAL_AVG_DOCUMENTS_PROCESSED.getPreferredName(), - stats.getExpAvgDocumentsProcessed()); + if (randomBoolean()) { + builder.field(IndexerJobStats.NUM_PAGES.getPreferredName(), stats.getNumPages()); + builder.field(IndexerJobStats.NUM_INPUT_DOCUMENTS.getPreferredName(), stats.getNumDocuments()); + 
builder.field(IndexerJobStats.NUM_OUTPUT_DOCUMENTS.getPreferredName(), stats.getOutputDocuments()); + builder.field(IndexerJobStats.NUM_INVOCATIONS.getPreferredName(), stats.getNumInvocations()); + builder.field(IndexerJobStats.INDEX_TIME_IN_MS.getPreferredName(), stats.getIndexTime()); + builder.field(IndexerJobStats.INDEX_TOTAL.getPreferredName(), stats.getIndexTotal()); + builder.field(IndexerJobStats.INDEX_FAILURES.getPreferredName(), stats.getIndexFailures()); + builder.field(IndexerJobStats.SEARCH_TIME_IN_MS.getPreferredName(), stats.getSearchTime()); + builder.field(IndexerJobStats.SEARCH_TOTAL.getPreferredName(), stats.getSearchTotal()); + builder.field(IndexerJobStats.PROCESSING_TIME_IN_MS.getPreferredName(), stats.getProcessingTime()); + builder.field(IndexerJobStats.PROCESSING_TOTAL.getPreferredName(), stats.getProcessingTotal()); + builder.field(IndexerJobStats.SEARCH_FAILURES.getPreferredName(), stats.getSearchFailures()); + builder.field( + TransformIndexerStats.EXPONENTIAL_AVG_CHECKPOINT_DURATION_MS.getPreferredName(), + stats.getExpAvgCheckpointDurationMs() + ); + builder.field(TransformIndexerStats.EXPONENTIAL_AVG_DOCUMENTS_INDEXED.getPreferredName(), stats.getExpAvgDocumentsIndexed()); + builder.field( + TransformIndexerStats.EXPONENTIAL_AVG_DOCUMENTS_PROCESSED.getPreferredName(), + stats.getExpAvgDocumentsProcessed() + ); + } else { + // a toXContent version which leaves out field with value 0 (simulating the case that an older version misses a field) + xContentFieldIfNotZero(builder, IndexerJobStats.NUM_PAGES.getPreferredName(), stats.getNumPages()); + xContentFieldIfNotZero(builder, IndexerJobStats.NUM_INPUT_DOCUMENTS.getPreferredName(), stats.getNumDocuments()); + xContentFieldIfNotZero(builder, IndexerJobStats.NUM_OUTPUT_DOCUMENTS.getPreferredName(), stats.getOutputDocuments()); + xContentFieldIfNotZero(builder, IndexerJobStats.NUM_INVOCATIONS.getPreferredName(), stats.getNumInvocations()); + xContentFieldIfNotZero(builder, IndexerJobStats.INDEX_TIME_IN_MS.getPreferredName(), stats.getIndexTime()); + xContentFieldIfNotZero(builder, IndexerJobStats.INDEX_TOTAL.getPreferredName(), stats.getIndexTotal()); + xContentFieldIfNotZero(builder, IndexerJobStats.INDEX_FAILURES.getPreferredName(), stats.getIndexFailures()); + xContentFieldIfNotZero(builder, IndexerJobStats.SEARCH_TIME_IN_MS.getPreferredName(), stats.getSearchTime()); + xContentFieldIfNotZero(builder, IndexerJobStats.SEARCH_TOTAL.getPreferredName(), stats.getSearchTotal()); + xContentFieldIfNotZero(builder, IndexerJobStats.PROCESSING_TIME_IN_MS.getPreferredName(), stats.getProcessingTime()); + xContentFieldIfNotZero(builder, IndexerJobStats.PROCESSING_TOTAL.getPreferredName(), stats.getProcessingTotal()); + xContentFieldIfNotZero(builder, IndexerJobStats.SEARCH_FAILURES.getPreferredName(), stats.getSearchFailures()); + xContentFieldIfNotZero( + builder, + TransformIndexerStats.EXPONENTIAL_AVG_CHECKPOINT_DURATION_MS.getPreferredName(), + stats.getExpAvgCheckpointDurationMs() + ); + xContentFieldIfNotZero( + builder, + TransformIndexerStats.EXPONENTIAL_AVG_DOCUMENTS_INDEXED.getPreferredName(), + stats.getExpAvgDocumentsIndexed() + ); + xContentFieldIfNotZero( + builder, + TransformIndexerStats.EXPONENTIAL_AVG_DOCUMENTS_PROCESSED.getPreferredName(), + stats.getExpAvgDocumentsProcessed() + ); + } builder.endObject(); } + + private static XContentBuilder xContentFieldIfNotZero(XContentBuilder builder, String name, long value) throws IOException { + if (value > 0) { + builder.field(name, value); + } + + return builder; 
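// e.g. xContentFieldIfNotZero(builder, IndexerJobStats.INDEX_FAILURES.getPreferredName(), 0L)
// leaves the builder untouched, mimicking a payload written by an older version
// that never emitted the field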
+ } + + private static XContentBuilder xContentFieldIfNotZero(XContentBuilder builder, String name, double value) throws IOException { + if (value > 0.0) { + builder.field(name, value); + } + + return builder; + } } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformIndexerStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformIndexerStatsTests.java index 50c98bcd8f0b6..eb74164be5364 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformIndexerStatsTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformIndexerStatsTests.java @@ -30,17 +30,31 @@ public class TransformIndexerStatsTests extends AbstractResponseTestCase< org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats, - TransformIndexerStats> { + TransformIndexerStats> { + + public static org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats randomStats() { + return new org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats( + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomLongBetween(0L, 10000L), + randomDouble(), + randomDouble(), + randomDouble() + ); + } @Override protected org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats createServerTestInstance(XContentType xContentType) { - return new org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats(randomLongBetween(10L, 10000L), - randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), - randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), - randomLongBetween(0L, 10000L), - randomBoolean() ? null : randomDouble(), - randomBoolean() ? null : randomDouble(), - randomBoolean() ? 
null : randomDouble()); + return randomStats(); } @Override @@ -49,8 +63,10 @@ protected TransformIndexerStats doParseToClientInstance(XContentParser parser) t } @Override - protected void assertInstances(org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats serverTestInstance, - TransformIndexerStats clientInstance) { + protected void assertInstances( + org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats serverTestInstance, + TransformIndexerStats clientInstance + ) { assertTransformIndexerStats(serverTestInstance, clientInstance); } } diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformStatsTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformStatsTests.java index 63f6bffdf6321..c732c425fe427 100644 --- a/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformStatsTests.java +++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/transform/transforms/hlrc/TransformStatsTests.java @@ -36,40 +36,36 @@ import static org.hamcrest.Matchers.equalTo; -public class TransformStatsTests extends AbstractResponseTestCase<org.elasticsearch.xpack.core.transform.transforms.TransformStats, +public class TransformStatsTests extends AbstractResponseTestCase< + org.elasticsearch.xpack.core.transform.transforms.TransformStats, TransformStats> { public static org.elasticsearch.xpack.core.transform.transforms.NodeAttributes randomNodeAttributes() { int numberOfAttributes = randomIntBetween(1, 10); Map<String, String> attributes = new HashMap<>(numberOfAttributes); - for(int i = 0; i < numberOfAttributes; i++) { + for (int i = 0; i < numberOfAttributes; i++) { String val = randomAlphaOfLength(10); - attributes.put("key-"+i, val); + attributes.put("key-" + i, val); } - return new org.elasticsearch.xpack.core.transform.transforms.NodeAttributes(randomAlphaOfLength(10), + return new org.elasticsearch.xpack.core.transform.transforms.NodeAttributes( randomAlphaOfLength(10), randomAlphaOfLength(10), randomAlphaOfLength(10), - attributes); + randomAlphaOfLength(10), + attributes + ); } - public static org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats randomStats() { - return new org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats(randomLongBetween(10L, 10000L), - randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), - randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), randomLongBetween(0L, 10000L), - randomLongBetween(0L, 10000L), - randomBoolean() ? null : randomDouble(), - randomBoolean() ? null : randomDouble(), - randomBoolean() ? null : randomDouble()); - } @Override protected org.elasticsearch.xpack.core.transform.transforms.TransformStats createServerTestInstance(XContentType xContentType) { - return new org.elasticsearch.xpack.core.transform.transforms.TransformStats(randomAlphaOfLength(10), + return new org.elasticsearch.xpack.core.transform.transforms.TransformStats( + randomAlphaOfLength(10), randomFrom(org.elasticsearch.xpack.core.transform.transforms.TransformStats.State.values()), randomBoolean() ? null : randomAlphaOfLength(100), randomBoolean() ?
null : randomNodeAttributes(), - randomStats(), - TransformCheckpointingInfoTests.randomTransformCheckpointingInfo()); + TransformIndexerStatsTests.randomStats(), + TransformCheckpointingInfoTests.randomTransformCheckpointingInfo() + ); } @Override @@ -78,8 +74,10 @@ protected TransformStats doParseToClientInstance(XContentParser parser) throws I } @Override - protected void assertInstances(org.elasticsearch.xpack.core.transform.transforms.TransformStats serverTestInstance, - TransformStats clientInstance) { + protected void assertInstances( + org.elasticsearch.xpack.core.transform.transforms.TransformStats serverTestInstance, + TransformStats clientInstance + ) { assertThat(serverTestInstance.getId(), equalTo(clientInstance.getId())); assertThat(serverTestInstance.getState().value(), equalTo(clientInstance.getState().value())); assertTransformIndexerStats(serverTestInstance.getIndexerStats(), clientInstance.getIndexerStats()); @@ -88,8 +86,10 @@ protected void assertInstances(org.elasticsearch.xpack.core.transform.transforms assertThat(serverTestInstance.getReason(), equalTo(clientInstance.getReason())); } - private void assertNodeAttributes(org.elasticsearch.xpack.core.transform.transforms.NodeAttributes serverTestInstance, - NodeAttributes clientInstance) { + private void assertNodeAttributes( + org.elasticsearch.xpack.core.transform.transforms.NodeAttributes serverTestInstance, + NodeAttributes clientInstance + ) { if (serverTestInstance == null || clientInstance == null) { assertNull(serverTestInstance); assertNull(clientInstance); @@ -102,8 +102,10 @@ private void assertNodeAttributes(org.elasticsearch.xpack.core.transform.transfo assertThat(serverTestInstance.getTransportAddress(), equalTo(clientInstance.getTransportAddress())); } - public static void assertTransformProgress(org.elasticsearch.xpack.core.transform.transforms.TransformProgress serverTestInstance, - TransformProgress clientInstance) { + public static void assertTransformProgress( + org.elasticsearch.xpack.core.transform.transforms.TransformProgress serverTestInstance, + TransformProgress clientInstance + ) { if (serverTestInstance == null || clientInstance == null) { assertNull(serverTestInstance); assertNull(clientInstance); @@ -115,16 +117,18 @@ public static void assertTransformProgress(org.elasticsearch.xpack.core.transfor assertThat(serverTestInstance.getDocumentsIndexed(), equalTo(clientInstance.getDocumentsIndexed())); } - public static void assertPosition(org.elasticsearch.xpack.core.transform.transforms.TransformIndexerPosition serverTestInstance, - TransformIndexerPosition clientInstance) { + public static void assertPosition( + org.elasticsearch.xpack.core.transform.transforms.TransformIndexerPosition serverTestInstance, + TransformIndexerPosition clientInstance + ) { assertThat(serverTestInstance.getIndexerPosition(), equalTo(clientInstance.getIndexerPosition())); assertThat(serverTestInstance.getBucketsPosition(), equalTo(clientInstance.getBucketsPosition())); } - public static void assertTransformCheckpointStats( - org.elasticsearch.xpack.core.transform.transforms.TransformCheckpointStats serverTestInstance, - TransformCheckpointStats clientInstance) { + org.elasticsearch.xpack.core.transform.transforms.TransformCheckpointStats serverTestInstance, + TransformCheckpointStats clientInstance + ) { assertTransformProgress(serverTestInstance.getCheckpointProgress(), clientInstance.getCheckpointProgress()); assertThat(serverTestInstance.getCheckpoint(), equalTo(clientInstance.getCheckpoint())); 
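// (these assert* helpers complete the AbstractResponseTestCase round trip: a random
// server-side instance is rendered to XContent, parsed back through the client's
// fromXContent, and then compared field by field as above)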
assertPosition(serverTestInstance.getPosition(), clientInstance.getPosition()); @@ -133,8 +137,9 @@ public static void assertTransformCheckpointStats( } public static void assertTransformCheckpointInfo( - org.elasticsearch.xpack.core.transform.transforms.TransformCheckpointingInfo serverTestInstance, - TransformCheckpointingInfo clientInstance) { + org.elasticsearch.xpack.core.transform.transforms.TransformCheckpointingInfo serverTestInstance, + TransformCheckpointingInfo clientInstance + ) { assertTransformCheckpointStats(serverTestInstance.getNext(), clientInstance.getNext()); assertTransformCheckpointStats(serverTestInstance.getLast(), clientInstance.getLast()); assertThat(serverTestInstance.getChangesLastDetectedAt(), equalTo(clientInstance.getChangesLastDetectedAt())); @@ -142,8 +147,9 @@ public static void assertTransformCheckpointInfo( } public static void assertTransformIndexerStats( - org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats serverTestInstance, - TransformIndexerStats clientInstance) { + org.elasticsearch.xpack.core.transform.transforms.TransformIndexerStats serverTestInstance, + TransformIndexerStats clientInstance + ) { assertThat(serverTestInstance.getExpAvgCheckpointDurationMs(), equalTo(clientInstance.getExpAvgCheckpointDurationMs())); assertThat(serverTestInstance.getExpAvgDocumentsProcessed(), equalTo(clientInstance.getExpAvgDocumentsProcessed())); assertThat(serverTestInstance.getExpAvgDocumentsIndexed(), equalTo(clientInstance.getExpAvgDocumentsIndexed())); diff --git a/client/sniffer/licenses/jackson-core-2.10.3.jar.sha1 b/client/sniffer/licenses/jackson-core-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..f23937b0d82a4 --- /dev/null +++ b/client/sniffer/licenses/jackson-core-2.10.3.jar.sha1 @@ -0,0 +1 @@ +f7ee7b55c7d292ac72fbaa7648c089f069c938d2 \ No newline at end of file diff --git a/client/sniffer/licenses/jackson-core-2.8.11.jar.sha1 b/client/sniffer/licenses/jackson-core-2.8.11.jar.sha1 deleted file mode 100644 index e7ad1e74ed6b8..0000000000000 --- a/client/sniffer/licenses/jackson-core-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -876ead1db19f0c9e79c9789273a3ef8c6fd6c29b \ No newline at end of file diff --git a/distribution/archives/build.gradle b/distribution/archives/build.gradle index 029eb0733023f..969e92fda28ce 100644 --- a/distribution/archives/build.gradle +++ b/distribution/archives/build.gradle @@ -50,7 +50,7 @@ task createJvmOptionsDir(type: EmptyDirTask) { dirMode = 0750 } -CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, String platform, boolean oss, boolean jdk) { +CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, String platform, String architecture, boolean oss, boolean jdk) { return copySpec { into("elasticsearch-${version}") { into('lib') { @@ -70,7 +70,7 @@ CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, String pla } if (jdk) { into("darwin".equals(platform) ? 
'jdk.app' : 'jdk') { - with jdkFiles(project, platform) + with jdkFiles(project, platform, architecture) } } into('') { @@ -116,31 +116,31 @@ Closure commonZipConfig = { task buildIntegTestZip(type: Zip) { configure(commonZipConfig) - with archiveFiles(transportModulesFiles, 'zip', null, true, false) + with archiveFiles(transportModulesFiles, 'zip', null, 'x64', true, false) } task buildWindowsZip(type: Zip) { configure(commonZipConfig) archiveClassifier = 'windows-x86_64' - with archiveFiles(modulesFiles(false, 'windows'), 'zip', 'windows', false, true) + with archiveFiles(modulesFiles(false, 'windows'), 'zip', 'windows', 'x64', false, true) } task buildOssWindowsZip(type: Zip) { configure(commonZipConfig) archiveClassifier = 'windows-x86_64' - with archiveFiles(modulesFiles(true, 'windows'), 'zip', 'windows', true, true) + with archiveFiles(modulesFiles(true, 'windows'), 'zip', 'windows', 'x64', true, true) } task buildNoJdkWindowsZip(type: Zip) { configure(commonZipConfig) archiveClassifier = 'no-jdk-windows-x86_64' - with archiveFiles(modulesFiles(false, 'windows'), 'zip', 'windows', false, false) + with archiveFiles(modulesFiles(false, 'windows'), 'zip', 'windows', 'x64', false, false) } task buildOssNoJdkWindowsZip(type: Zip) { configure(commonZipConfig) archiveClassifier = 'no-jdk-windows-x86_64' - with archiveFiles(modulesFiles(true, 'windows'), 'zip', 'windows', true, false) + with archiveFiles(modulesFiles(true, 'windows'), 'zip', 'windows', 'x64', true, false) } Closure commonTarConfig = { @@ -153,49 +153,61 @@ Closure commonTarConfig = { task buildDarwinTar(type: SymbolicLinkPreservingTar) { configure(commonTarConfig) archiveClassifier = 'darwin-x86_64' - with archiveFiles(modulesFiles(false, 'darwin'), 'tar', 'darwin', false, true) + with archiveFiles(modulesFiles(false, 'darwin'), 'tar', 'darwin', 'x64', false, true) } task buildOssDarwinTar(type: SymbolicLinkPreservingTar) { configure(commonTarConfig) archiveClassifier = 'darwin-x86_64' - with archiveFiles(modulesFiles(true, 'darwin'), 'tar', 'darwin', true, true) + with archiveFiles(modulesFiles(true, 'darwin'), 'tar', 'darwin', 'x64', true, true) } task buildNoJdkDarwinTar(type: SymbolicLinkPreservingTar) { configure(commonTarConfig) archiveClassifier = 'no-jdk-darwin-x86_64' - with archiveFiles(modulesFiles(false, 'darwin'), 'tar', 'darwin', false, false) + with archiveFiles(modulesFiles(false, 'darwin'), 'tar', 'darwin', 'x64', false, false) } task buildOssNoJdkDarwinTar(type: SymbolicLinkPreservingTar) { configure(commonTarConfig) archiveClassifier = 'no-jdk-darwin-x86_64' - with archiveFiles(modulesFiles(true, 'darwin'), 'tar', 'darwin', true, false) + with archiveFiles(modulesFiles(true, 'darwin'), 'tar', 'darwin', 'x64', true, false) +} + +task buildLinuxAarch64Tar(type: SymbolicLinkPreservingTar) { + configure(commonTarConfig) + archiveClassifier = 'linux-aarch64' + with archiveFiles(modulesFiles(false, 'linux'), 'tar', 'linux', 'aarch64', false, true) } task buildLinuxTar(type: SymbolicLinkPreservingTar) { configure(commonTarConfig) archiveClassifier = 'linux-x86_64' - with archiveFiles(modulesFiles(false, 'linux'), 'tar', 'linux', false, true) + with archiveFiles(modulesFiles(false, 'linux'), 'tar', 'linux', 'x64', false, true) +} + +task buildOssLinuxAarch64Tar(type: SymbolicLinkPreservingTar) { + configure(commonTarConfig) + archiveClassifier = 'linux-aarch64' + with archiveFiles(modulesFiles(true, 'linux'), 'tar', 'linux', 'aarch64', true, true) } task buildOssLinuxTar(type: SymbolicLinkPreservingTar) { 
configure(commonTarConfig) archiveClassifier = 'linux-x86_64' - with archiveFiles(modulesFiles(true, 'linux'), 'tar', 'linux', true, true) + with archiveFiles(modulesFiles(true, 'linux'), 'tar', 'linux', 'x64', true, true) } task buildNoJdkLinuxTar(type: SymbolicLinkPreservingTar) { configure(commonTarConfig) archiveClassifier = 'no-jdk-linux-x86_64' - with archiveFiles(modulesFiles(false, 'linux'), 'tar', 'linux', false, false) + with archiveFiles(modulesFiles(false, 'linux'), 'tar', 'linux', 'x64', false, false) } task buildOssNoJdkLinuxTar(type: SymbolicLinkPreservingTar) { configure(commonTarConfig) archiveClassifier = 'no-jdk-linux-x86_64' - with archiveFiles(modulesFiles(true, 'linux'), 'tar', 'linux', true, false) + with archiveFiles(modulesFiles(true, 'linux'), 'tar', 'linux', 'x64', true, false) } Closure tarExists = { it -> new File('/bin/tar').exists() || new File('/usr/bin/tar').exists() || new File('/usr/local/bin/tar').exists() } diff --git a/distribution/archives/linux-aarch64-tar/build.gradle b/distribution/archives/linux-aarch64-tar/build.gradle new file mode 100644 index 0000000000000..4a6dde5fc0c92 --- /dev/null +++ b/distribution/archives/linux-aarch64-tar/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// distribution is done in the parent project. diff --git a/distribution/archives/oss-linux-aarch64-tar/build.gradle b/distribution/archives/oss-linux-aarch64-tar/build.gradle new file mode 100644 index 0000000000000..4a6dde5fc0c92 --- /dev/null +++ b/distribution/archives/oss-linux-aarch64-tar/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// distribution is done in the parent project. diff --git a/distribution/build.gradle b/distribution/build.gradle index 584b80a8c67de..5fc350f5441e1 100644 --- a/distribution/build.gradle +++ b/distribution/build.gradle @@ -391,16 +391,17 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) { } } - jdkFiles = { Project project, String platform -> + jdkFiles = { Project project, String platform, String architecture -> project.jdks { - "bundled_${platform}" { + "bundled_${platform}_${architecture}" { it.platform = platform it.version = VersionProperties.getBundledJdk(platform) it.vendor = VersionProperties.bundledJdkVendor + it.architecture = architecture } } return copySpec { - from project.jdks."bundled_${platform}" + from project.jdks."bundled_${platform}_${architecture}" exclude "demo/**" eachFile { FileCopyDetails details -> if (details.relativePath.segments[-2] == 'bin' || details.relativePath.segments[-1] == 'jspawnhelper') { @@ -607,10 +608,13 @@ subprojects { ['archives:windows-zip', 'archives:oss-windows-zip', 'archives:darwin-tar', 'archives:oss-darwin-tar', + 'archives:linux-aarch64-tar', 'archives:oss-linux-aarch64-tar', 'archives:linux-tar', 'archives:oss-linux-tar', 'archives:integ-test-zip', 'packages:rpm', 'packages:deb', + 'packages:aarch64-rpm', 'packages:aarch64-deb', 'packages:oss-rpm', 'packages:oss-deb', + 'packages:aarch64-oss-rpm', 'packages:aarch64-oss-deb' ].forEach { subName -> Project subproject = project("${project.path}:${subName}") Configuration configuration = configurations.create(subproject.name) diff --git a/distribution/docker/aarch64-docker-build-context/build.gradle b/distribution/docker/aarch64-docker-build-context/build.gradle new file mode 100644 index 0000000000000..19b0bc3646c60 --- /dev/null +++ b/distribution/docker/aarch64-docker-build-context/build.gradle @@ -0,0 +1,11 @@ 
+apply plugin: 'base' + +task buildDockerBuildContext(type: Tar) { + extension = 'tar.gz' + compression = Compression.GZIP + archiveClassifier = "docker-build-context" + archiveBaseName = "elasticsearch-aarch64" + with dockerBuildContext("aarch64", false, false) +} + +assemble.dependsOn buildDockerBuildContext diff --git a/distribution/docker/aarch64-docker-export/build.gradle b/distribution/docker/aarch64-docker-export/build.gradle new file mode 100644 index 0000000000000..537b5a093683e --- /dev/null +++ b/distribution/docker/aarch64-docker-export/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// export is done in the parent project. diff --git a/distribution/docker/aarch64-oss-docker-build-context/build.gradle b/distribution/docker/aarch64-oss-docker-build-context/build.gradle new file mode 100644 index 0000000000000..bea7d156803fd --- /dev/null +++ b/distribution/docker/aarch64-oss-docker-build-context/build.gradle @@ -0,0 +1,11 @@ +apply plugin: 'base' + +task buildOssDockerBuildContext(type: Tar) { + extension = 'tar.gz' + compression = Compression.GZIP + archiveClassifier = "docker-build-context" + archiveBaseName = "elasticsearch-aarch64-oss" + with dockerBuildContext("aarch64", true, false) +} + +assemble.dependsOn buildOssDockerBuildContext diff --git a/distribution/docker/aarch64-oss-docker-export/build.gradle b/distribution/docker/aarch64-oss-docker-export/build.gradle new file mode 100644 index 0000000000000..537b5a093683e --- /dev/null +++ b/distribution/docker/aarch64-oss-docker-export/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// export is done in the parent project. diff --git a/distribution/docker/build.gradle b/distribution/docker/build.gradle index c33f74501da95..f67e520ce76a4 100644 --- a/distribution/docker/build.gradle +++ b/distribution/docker/build.gradle @@ -1,3 +1,4 @@ +import org.elasticsearch.gradle.Architecture import org.elasticsearch.gradle.ElasticsearchDistribution.Flavor import org.elasticsearch.gradle.LoggedExec import org.elasticsearch.gradle.VersionProperties @@ -13,38 +14,51 @@ testFixtures.useFixture() configurations { dockerPlugins + aarch64DockerSource dockerSource + aarch64OssDockerSource ossDockerSource } dependencies { + aarch64DockerSource project(path: ":distribution:archives:linux-aarch64-tar") dockerSource project(path: ":distribution:archives:linux-tar") + aarch64OssDockerSource project(path: ":distribution:archives:oss-linux-aarch64-tar") ossDockerSource project(path: ":distribution:archives:oss-linux-tar") } -ext.expansions = { oss, local -> - final String classifier = 'linux-x86_64' +ext.expansions = { architecture, oss, local -> + switch (architecture) { + case "aarch64": + case "x64": + break; + default: + throw new IllegalArgumentException("unrecognized architecture [" + architecture + "], must be one of (aarch64|x64)") + } + final String classifier = "aarch64".equals(architecture) ? "linux-aarch64" : "linux-x86_64" final String elasticsearch = oss ? "elasticsearch-oss-${VersionProperties.elasticsearch}-${classifier}.tar.gz" : "elasticsearch-${VersionProperties.elasticsearch}-${classifier}.tar.gz" return [ + 'base_image' : "aarch64".equals(architecture) ? "arm64v8/centos:7" : "centos:7", 'build_date' : BuildParams.buildDate, 'elasticsearch' : elasticsearch, 'git_revision' : BuildParams.gitRevision, 'license' : oss ? 'Apache-2.0' : 'Elastic-License', 'source_elasticsearch': local ?
"COPY $elasticsearch /opt/" : "RUN cd /opt && curl --retry 8 -s -L -O https://artifacts.elastic.co/downloads/elasticsearch/${elasticsearch} && cd -", + 'tini_suffix' : "aarch64".equals(architecture) ? "-arm64" : "", 'version' : VersionProperties.elasticsearch ] } -private static String buildPath(final boolean oss) { - return "build/${oss ? 'oss-' : ''}docker" +private static String buildPath(final String architecture, final boolean oss) { + return "build/${"aarch64".equals(architecture) ? 'aarch64-' : ''}${oss ? 'oss-' : ''}docker" } -private static String taskName(final String prefix, final boolean oss, final String suffix) { - return "${prefix}${oss ? 'Oss' : ''}${suffix}" +private static String taskName(final String prefix, final String architecture, final boolean oss, final String suffix) { + return "${prefix}${"aarch64".equals(architecture) ? 'Aarch64' : ''}${oss ? 'Oss' : ''}${suffix}" } project.ext { - dockerBuildContext = { boolean oss, boolean local -> + dockerBuildContext = { String architecture, boolean oss, boolean local -> copySpec { into('bin') { from project.projectDir.toPath().resolve("src/docker/bin") @@ -62,25 +76,33 @@ project.ext { } from(project.projectDir.toPath().resolve("src/docker/Dockerfile")) { - expand(expansions(oss, local)) + expand(expansions(architecture, oss, local)) } } } } -void addCopyDockerContextTask(final boolean oss) { - task(taskName("copy", oss, "DockerContext"), type: Sync) { - expansions(oss, true).findAll { it.key != 'build_date' }.each { k, v -> +void addCopyDockerContextTask(final String architecture, final boolean oss) { + task(taskName("copy", architecture, oss, "DockerContext"), type: Sync) { + expansions(architecture, oss, true).findAll { it.key != 'build_date' }.each { k, v -> inputs.property(k, { v.toString() }) } - into buildPath(oss) + into buildPath(architecture, oss) - with dockerBuildContext(oss, true) + with dockerBuildContext(architecture, oss, true) - if (oss) { - from configurations.ossDockerSource + if ("aarch64".equals(architecture)) { + if (oss) { + from configurations.aarch64OssDockerSource + } else { + from configurations.aarch64DockerSource + } } else { - from configurations.dockerSource + if (oss) { + from configurations.ossDockerSource + } else { + from configurations.dockerSource + } } from configurations.dockerPlugins @@ -149,9 +171,9 @@ task integTest(type: Test) { check.dependsOn integTest -void addBuildDockerImage(final boolean oss) { - final Task buildDockerImageTask = task(taskName("build", oss, "DockerImage"), type: DockerBuildTask) { - TaskProvider copyContextTask = tasks.named(taskName("copy", oss, "DockerContext")) +void addBuildDockerImage(final String architecture, final boolean oss) { + final Task buildDockerImageTask = task(taskName("build", architecture, oss, "DockerImage"), type: DockerBuildTask) { + TaskProvider copyContextTask = tasks.named(taskName("copy", architecture, oss, "DockerContext")) dependsOn(copyContextTask) dockerContext.fileProvider(copyContextTask.map { it.destinationDir }) @@ -169,12 +191,15 @@ void addBuildDockerImage(final boolean oss) { ] } } + buildDockerImageTask.onlyIf { Architecture.current().name().toLowerCase().equals(architecture) } assemble.dependsOn(buildDockerImageTask) } -for (final boolean oss : [false, true]) { - addCopyDockerContextTask(oss) - addBuildDockerImage(oss) +for (final String architecture : ["aarch64", "x64"]) { + for (final boolean oss : [false, true]) { + addCopyDockerContextTask(architecture, oss) + addBuildDockerImage(architecture, oss) + } } // We build 
the images used in compose locally, but the pull command insists on using a repository @@ -192,11 +217,12 @@ subprojects { Project subProject -> if (subProject.name.contains('docker-export')) { apply plugin: 'distribution' + final String architecture = subProject.name.contains('aarch64-') ? 'aarch64' : 'x64' final boolean oss = subProject.name.contains('oss-') - def exportTaskName = taskName("export", oss, "DockerImage") - def buildTaskName = taskName("build", oss, "DockerImage") - def tarFile = "${parent.projectDir}/build/elasticsearch${oss ? '-oss' : ''}_test.${VersionProperties.elasticsearch}.docker.tar" + def exportTaskName = taskName("export", architecture, oss, "DockerImage") + def buildTaskName = taskName("build", architecture, oss, "DockerImage") + def tarFile = "${parent.projectDir}/build/elasticsearch${"aarch64".equals(architecture) ? '-aarch64' : ''}${oss ? '-oss' : ''}_test.${VersionProperties.elasticsearch}.docker.tar" final Task exportDockerImageTask = task(exportTaskName, type: LoggedExec) { inputs.file("${parent.projectDir}/build/markers/${buildTaskName}.marker") @@ -212,7 +238,7 @@ subprojects { Project subProject -> artifacts.add('default', file(tarFile)) { type 'tar' - name "elasticsearch${oss ? '-oss' : ''}" + name "elasticsearch${"aarch64".equals(architecture) ? '-aarch64' : ''}${oss ? '-oss' : ''}" builtBy exportTaskName } diff --git a/distribution/docker/docker-build-context/build.gradle b/distribution/docker/docker-build-context/build.gradle index 50be407e566bc..2dd28329d7ba5 100644 --- a/distribution/docker/docker-build-context/build.gradle +++ b/distribution/docker/docker-build-context/build.gradle @@ -5,7 +5,7 @@ task buildDockerBuildContext(type: Tar) { compression = Compression.GZIP archiveClassifier = "docker-build-context" archiveBaseName = "elasticsearch" - with dockerBuildContext(false, false) + with dockerBuildContext("x64", false, false) } assemble.dependsOn buildDockerBuildContext diff --git a/distribution/docker/oss-docker-build-context/build.gradle b/distribution/docker/oss-docker-build-context/build.gradle index b69f7dc620f53..0a29c2a2b7274 100644 --- a/distribution/docker/oss-docker-build-context/build.gradle +++ b/distribution/docker/oss-docker-build-context/build.gradle @@ -5,7 +5,7 @@ task buildOssDockerBuildContext(type: Tar) { compression = Compression.GZIP archiveClassifier = "docker-build-context" archiveBaseName = "elasticsearch-oss" - with dockerBuildContext(true, false) + with dockerBuildContext("x64", true, false) } assemble.dependsOn buildOssDockerBuildContext diff --git a/distribution/docker/src/docker/Dockerfile b/distribution/docker/src/docker/Dockerfile index b12a3c7518265..7eb62b7d8ad1b 100644 --- a/distribution/docker/src/docker/Dockerfile +++ b/distribution/docker/src/docker/Dockerfile @@ -11,7 +11,7 @@ # Set gid=0 and make group perms==owner perms ################################################################################ -FROM centos:7 AS builder +FROM ${base_image} AS builder RUN for iter in {1..10}; do yum update --setopt=tsflags=nodocs -y && \ yum install --setopt=tsflags=nodocs -y gzip shadow-utils tar && \ @@ -42,8 +42,8 @@ RUN chmod 0660 config/elasticsearch.yml config/log4j2.properties # gpg, but the keyservers are slow to return the key and this can fail the # build. Instead, we check the binary against a checksum that we have # computed. 
-ADD https://github.com/krallin/tini/releases/download/v0.18.0/tini /tini -COPY config/tini.sha512 /tini.sha512 +ADD https://github.com/krallin/tini/releases/download/v0.18.0/tini${tini_suffix} /tini +COPY config/tini${tini_suffix}.sha512 /tini.sha512 RUN sha512sum -c /tini.sha512 && chmod +x /tini ################################################################################ @@ -52,7 +52,7 @@ RUN sha512sum -c /tini.sha512 && chmod +x /tini # Add entrypoint ################################################################################ -FROM centos:7 +FROM ${base_image} ENV ELASTIC_CONTAINER true diff --git a/distribution/docker/src/docker/config/tini-arm64.sha512 b/distribution/docker/src/docker/config/tini-arm64.sha512 new file mode 100644 index 0000000000000..274eaa28cff08 --- /dev/null +++ b/distribution/docker/src/docker/config/tini-arm64.sha512 @@ -0,0 +1 @@ +6ae5147e522e484b9d59b0caa04e6dadf0efe332b272039c7cf5951e39f5028e9852c3c4bcdd46b98977329108d555ee7ea55f9eca99765d05922ec7aff837d8 /tini diff --git a/distribution/packages/aarch64-deb/build.gradle b/distribution/packages/aarch64-deb/build.gradle new file mode 100644 index 0000000000000..4a6dde5fc0c92 --- /dev/null +++ b/distribution/packages/aarch64-deb/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// distribution is done in the parent project. diff --git a/distribution/packages/aarch64-oss-deb/build.gradle b/distribution/packages/aarch64-oss-deb/build.gradle new file mode 100644 index 0000000000000..4a6dde5fc0c92 --- /dev/null +++ b/distribution/packages/aarch64-oss-deb/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// distribution is done in the parent project. diff --git a/distribution/packages/aarch64-oss-rpm/build.gradle b/distribution/packages/aarch64-oss-rpm/build.gradle new file mode 100644 index 0000000000000..4a6dde5fc0c92 --- /dev/null +++ b/distribution/packages/aarch64-oss-rpm/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// distribution is done in the parent project. diff --git a/distribution/packages/aarch64-rpm/build.gradle b/distribution/packages/aarch64-rpm/build.gradle new file mode 100644 index 0000000000000..4a6dde5fc0c92 --- /dev/null +++ b/distribution/packages/aarch64-rpm/build.gradle @@ -0,0 +1,2 @@ +// This file is intentionally blank. All configuration of the +// distribution is done in the parent project. diff --git a/distribution/packages/build.gradle b/distribution/packages/build.gradle index 9b1efa3e43d7c..105f244e6bc5e 100644 --- a/distribution/packages/build.gradle +++ b/distribution/packages/build.gradle @@ -98,17 +98,17 @@ addProcessFilesTask('rpm', false, false) // Common configuration that is package dependent. This can't go in ospackage // since we have different templated files that need to be consumed, but the structure // is the same -Closure commonPackageConfig(String type, boolean oss, boolean jdk) { +Closure commonPackageConfig(String type, boolean oss, boolean jdk, String architecture) { return { onlyIf { OS.current().equals(OS.WINDOWS) == false } dependsOn "process${oss ? 'Oss' : ''}${jdk ? '' : 'NoJdk'}${type.capitalize()}Files" packageName "elasticsearch${oss ? '-oss' : ''}" - arch(type == 'deb' ? 'amd64' : 'X86_64') + arch(architecture == 'aarch64' ? 'aarch64' : type == 'deb' ? 'amd64' : 'X86_64') // Follow elasticsearch's file naming convention String jdkString = jdk ? "" : "no-jdk-" - String prefix = "${oss ? 'oss-' : ''}${jdk ? 
'' : 'no-jdk-'}${type}" + String prefix = "${architecture == 'aarch64' ? 'aarch64-' : ''}${oss ? 'oss-' : ''}${jdk ? '' : 'no-jdk-'}${type}" destinationDir = file("${prefix}/build/distributions") // SystemPackagingTask overrides default archive task convention mappings, but doesn't provide a setter so we have to override the convention mapping itself @@ -143,7 +143,7 @@ Closure commonPackageConfig(String type, boolean oss, boolean jdk) { } if (jdk) { into('jdk') { - with jdkFiles(project, 'linux') + with jdkFiles(project, 'linux', architecture) } } // we need to specify every intermediate directory in these paths so the package managers know they are explicitly @@ -306,9 +306,9 @@ ospackage { into '/usr/share/elasticsearch' } -Closure commonDebConfig(boolean oss, boolean jdk) { +Closure commonDebConfig(boolean oss, boolean jdk, String architecture) { return { - configure(commonPackageConfig('deb', oss, jdk)) + configure(commonPackageConfig('deb', oss, jdk, architecture)) // jdeb does not provide a way to set the License control attribute, and ospackage // silently ignores setting it. Instead, we set the license as "custom field" @@ -336,25 +336,33 @@ Closure commonDebConfig(boolean oss, boolean jdk) { } } +task buildAarch64Deb(type: Deb) { + configure(commonDebConfig(false, true, 'aarch64')) +} + task buildDeb(type: Deb) { - configure(commonDebConfig(false, true)) + configure(commonDebConfig(false, true, 'x64')) +} + +task buildAarch64OssDeb(type: Deb) { + configure(commonDebConfig(true, true, 'aarch64')) } task buildOssDeb(type: Deb) { - configure(commonDebConfig(true, true)) + configure(commonDebConfig(true, true, 'x64')) } task buildNoJdkDeb(type: Deb) { - configure(commonDebConfig(false, false)) + configure(commonDebConfig(false, false, 'x64')) } task buildOssNoJdkDeb(type: Deb) { - configure(commonDebConfig(true, false)) + configure(commonDebConfig(true, false, 'x64')) } -Closure commonRpmConfig(boolean oss, boolean jdk) { +Closure commonRpmConfig(boolean oss, boolean jdk, String architecture) { return { - configure(commonPackageConfig('rpm', oss, jdk)) + configure(commonPackageConfig('rpm', oss, jdk, architecture)) if (oss) { license 'ASL 2.0' @@ -381,20 +389,28 @@ Closure commonRpmConfig(boolean oss, boolean jdk) { } } +task buildAarch64Rpm(type: Rpm) { + configure(commonRpmConfig(false, true, 'aarch64')) +} + task buildRpm(type: Rpm) { - configure(commonRpmConfig(false, true)) + configure(commonRpmConfig(false, true, 'x64')) +} + +task buildAarch64OssRpm(type: Rpm) { + configure(commonRpmConfig(true, true, 'aarch64')) } task buildOssRpm(type: Rpm) { - configure(commonRpmConfig(true, true)) + configure(commonRpmConfig(true, true, 'x64')) } task buildNoJdkRpm(type: Rpm) { - configure(commonRpmConfig(false, false)) + configure(commonRpmConfig(false, false, 'x64')) } task buildOssNoJdkRpm(type: Rpm) { - configure(commonRpmConfig(true, false)) + configure(commonRpmConfig(true, false, 'x64')) } Closure dpkgExists = { it -> new File('/bin/dpkg-deb').exists() || new File('/usr/bin/dpkg-deb').exists() || new File('/usr/local/bin/dpkg-deb').exists() } @@ -446,6 +462,8 @@ subprojects { final File rpmDatabase = new File(extractionDir, 'rpm-database') commandLine 'rpm', '--badreloc', + '--ignorearch', + '--ignoreos', '--nodeps', '--noscripts', '--notriggers', diff --git a/distribution/src/bin/elasticsearch b/distribution/src/bin/elasticsearch index 136aed6755c5e..e2c2288cb664c 100755 --- a/distribution/src/bin/elasticsearch +++ b/distribution/src/bin/elasticsearch @@ -29,7 +29,7 @@ for option 
in "$@"; do done if [ -z "$ES_TMPDIR" ]; then - ES_TMPDIR=`"$JAVA" -cp "$ES_CLASSPATH" org.elasticsearch.tools.launchers.TempDirectory` + ES_TMPDIR=`"$JAVA" "$XSHARE" -cp "$ES_CLASSPATH" org.elasticsearch.tools.launchers.TempDirectory` fi # get keystore password before setting java options to avoid @@ -52,12 +52,13 @@ fi # - second, JVM options are read from jvm.options and jvm.options.d/*.options # - third, JVM options from ES_JAVA_OPTS are applied # - fourth, ergonomic JVM options are applied -ES_JAVA_OPTS=`export ES_TMPDIR; "$JAVA" -cp "$ES_CLASSPATH" org.elasticsearch.tools.launchers.JvmOptionsParser "$ES_PATH_CONF"` +ES_JAVA_OPTS=`export ES_TMPDIR; "$JAVA" "$XSHARE" -cp "$ES_CLASSPATH" org.elasticsearch.tools.launchers.JvmOptionsParser "$ES_PATH_CONF"` # manual parsing to find out, if process should be detached if [[ $DAEMONIZE = false ]]; then exec \ "$JAVA" \ + "$XSHARE" \ $ES_JAVA_OPTS \ -Des.path.home="$ES_HOME" \ -Des.path.conf="$ES_PATH_CONF" \ @@ -70,6 +71,7 @@ if [[ $DAEMONIZE = false ]]; then else exec \ "$JAVA" \ + "$XSHARE" \ $ES_JAVA_OPTS \ -Des.path.home="$ES_HOME" \ -Des.path.conf="$ES_PATH_CONF" \ diff --git a/distribution/src/bin/elasticsearch-cli b/distribution/src/bin/elasticsearch-cli index 4af827b67caf9..6f03456eb0122 100644 --- a/distribution/src/bin/elasticsearch-cli +++ b/distribution/src/bin/elasticsearch-cli @@ -22,6 +22,7 @@ ES_JAVA_OPTS="-Xms4m -Xmx64m -XX:+UseSerialGC ${ES_JAVA_OPTS}" exec \ "$JAVA" \ + "$XSHARE" \ $ES_JAVA_OPTS \ -Des.path.home="$ES_HOME" \ -Des.path.conf="$ES_PATH_CONF" \ diff --git a/distribution/src/bin/elasticsearch-env b/distribution/src/bin/elasticsearch-env index cbdfbf8facb5c..5a54ad58e0abd 100644 --- a/distribution/src/bin/elasticsearch-env +++ b/distribution/src/bin/elasticsearch-env @@ -67,8 +67,14 @@ if [ ! -z "$JAVA_OPTS" ]; then echo "pass JVM parameters via ES_JAVA_OPTS" fi +if [[ "$("$JAVA" -version 2>/dev/null)" =~ "Unable to map CDS archive" ]]; then + XSHARE="-Xshare:off" +else + XSHARE="-Xshare:auto" +fi + # check the Java version -"$JAVA" -cp "$ES_CLASSPATH" org.elasticsearch.tools.java_version_checker.JavaVersionChecker +"$JAVA" "$XSHARE" -cp "$ES_CLASSPATH" org.elasticsearch.tools.java_version_checker.JavaVersionChecker export HOSTNAME=$HOSTNAME diff --git a/distribution/tools/launchers/src/main/java/org/elasticsearch/tools/launchers/JvmErgonomics.java b/distribution/tools/launchers/src/main/java/org/elasticsearch/tools/launchers/JvmErgonomics.java index 9ed0479c4d15b..0b2e9f9092a74 100644 --- a/distribution/tools/launchers/src/main/java/org/elasticsearch/tools/launchers/JvmErgonomics.java +++ b/distribution/tools/launchers/src/main/java/org/elasticsearch/tools/launchers/JvmErgonomics.java @@ -87,6 +87,7 @@ private static List flagsFinal(final List userDefinedJvmOptions) final List command = Stream.of( Stream.of(java), userDefinedJvmOptions.stream(), + Stream.of("-Xshare:off"), Stream.of("-XX:+PrintFlagsFinal"), Stream.of("-version") ).reduce(Stream::concat).get().collect(Collectors.toUnmodifiableList()); diff --git a/docs/plugins/analysis-nori.asciidoc b/docs/plugins/analysis-nori.asciidoc index 7cc04c9c3de75..1e5f998a72da4 100644 --- a/docs/plugins/analysis-nori.asciidoc +++ b/docs/plugins/analysis-nori.asciidoc @@ -54,6 +54,10 @@ It can be set to: 가곡역 => 가곡역, 가곡, 역 -- +`discard_punctuation`:: + + Whether punctuation should be discarded from the output. Defaults to `true`. 
+ +`user_dictionary`:: + -- @@ -99,6 +103,7 @@ PUT nori_sample "nori_user_dict": { "type": "nori_tokenizer", "decompound_mode": "mixed", + "discard_punctuation": "false", "user_dictionary": "userdict_ko.txt" } }, @@ -434,3 +439,107 @@ Which responds with: -------------------------------------------------- <1> The Hanja form is replaced by the Hangul translation. + + +[[analysis-nori-number]] +==== `nori_number` token filter + +The `nori_number` token filter normalizes Korean numbers +to regular Arabic decimal numbers in half-width characters. + +Korean numbers are often written using a combination of Hangul and Arabic numbers with various kinds of punctuation. +For example, 3.2천 means 3200. +This filter does this kind of normalization and allows a search for 3200 to match 3.2천 in text, +but can also be used to make range facets based on the normalized numbers and so on. + +[NOTE] +==== +Notice that this analyzer uses a token composition scheme and relies on punctuation tokens +being found in the token stream. +Please make sure your `nori_tokenizer` has `discard_punctuation` set to false. +In case punctuation characters, such as U+FF0E(.), are removed from the token stream, +this filter would find input tokens 3 and 2천 and give outputs 3 and 2000 instead of 3200, +which is likely not the intended result. + +If you want to remove punctuation characters from your index that are not part of normalized numbers, +add a `stop` token filter with the punctuation you wish to remove after `nori_number` in your analyzer chain. +==== +Below are some examples of normalizations this filter supports. +The input is untokenized text and the result is the single term attribute emitted for the input. + +- 영영칠 -> 7 +- 일영영영 -> 1000 +- 삼천2백2십삼 -> 3223 +- 조육백만오천일 -> 1000006005001 +- 3.2천 -> 3200 +- 1.2만345.67 -> 12345.67 +- 4,647.100 -> 4647.1 +- 15,7 -> 157 (be aware of this weakness) + +For example: + +[source,console] +-------------------------------------------------- +PUT nori_sample +{ + "settings": { + "index": { + "analysis": { + "analyzer": { + "my_analyzer": { + "tokenizer": "tokenizer_discard_punctuation_false", + "filter": [ + "part_of_speech_stop_sp", "nori_number" + ] + } + }, + "tokenizer": { + "tokenizer_discard_punctuation_false": { + "type": "nori_tokenizer", + "discard_punctuation": "false" + } + }, + "filter": { + "part_of_speech_stop_sp": { + "type": "nori_part_of_speech", + "stoptags": ["SP"] + } + } + } + } + } +} + +GET nori_sample/_analyze +{ + "analyzer": "my_analyzer", + "text": "십만이천오백과 3.2천" +} -------------------------------------------------- + +Which results in: + +[source,console-result] -------------------------------------------------- +{ + "tokens" : [{ + "token" : "102500", + "start_offset" : 0, + "end_offset" : 6, + "type" : "word", + "position" : 0 + }, { + "token" : "과", + "start_offset" : 6, + "end_offset" : 7, + "type" : "word", + "position" : 1 + }, { + "token" : "3200", + "start_offset" : 8, + "end_offset" : 12, + "type" : "word", + "position" : 2 + }] +} +-------------------------------------------------- diff --git a/docs/reference/aggregations/metrics/top-metrics-aggregation.asciidoc index 6dbc8adfd3ff8..e0ab0afde2f3a 100644 --- a/docs/reference/aggregations/metrics/top-metrics-aggregation.asciidoc +++ b/docs/reference/aggregations/metrics/top-metrics-aggregation.asciidoc @@ -50,7 +50,7 @@ faster.
The `sort` field in the metric request functions exactly the same as the `sort` field in the <> request except: -* It can't be used on <>, <, <>, +* It can't be used on <>, <>, <>, <>, or <> fields. * It only supports a single sort value so which document wins ties is not specified. diff --git a/docs/reference/analysis/anatomy.asciidoc b/docs/reference/analysis/anatomy.asciidoc index 1db14e787a54a..22e7ffda667d4 100644 --- a/docs/reference/analysis/anatomy.asciidoc +++ b/docs/reference/analysis/anatomy.asciidoc @@ -10,6 +10,7 @@ blocks into analyzers suitable for different languages and types of text. Elasticsearch also exposes the individual building blocks so that they can be combined to define new <> analyzers. +[[analyzer-anatomy-character-filters]] ==== Character filters A _character filter_ receives the original text as a stream of characters and @@ -21,6 +22,7 @@ elements like `` from the stream. An analyzer may have *zero or more* <>, which are applied in order. +[[analyzer-anatomy-tokenizer]] ==== Tokenizer A _tokenizer_ receives a stream of characters, breaks it up into individual @@ -35,6 +37,7 @@ the term represents. An analyzer must have *exactly one* <>. +[[analyzer-anatomy-token-filters]] ==== Token filters A _token filter_ receives the token stream and may add, remove, or change diff --git a/docs/reference/analysis/concepts.asciidoc b/docs/reference/analysis/concepts.asciidoc index 2468286e3a719..2e431efcd5fec 100644 --- a/docs/reference/analysis/concepts.asciidoc +++ b/docs/reference/analysis/concepts.asciidoc @@ -8,6 +8,8 @@ This section explains the fundamental concepts of text analysis in {es}. * <> * <> +* <> include::anatomy.asciidoc[] -include::index-search-time.asciidoc[] \ No newline at end of file +include::index-search-time.asciidoc[] +include::token-graphs.asciidoc[] \ No newline at end of file diff --git a/docs/reference/analysis/testing.asciidoc b/docs/reference/analysis/testing.asciidoc index ba3300802ac87..845f275455eb2 100644 --- a/docs/reference/analysis/testing.asciidoc +++ b/docs/reference/analysis/testing.asciidoc @@ -55,7 +55,7 @@ The API returns the following response: You can also test combinations of: * A tokenizer -* Zero or token filters +* Zero or more token filters * Zero or more character filters [source,console] diff --git a/docs/reference/analysis/token-graphs.asciidoc b/docs/reference/analysis/token-graphs.asciidoc new file mode 100644 index 0000000000000..ab1dc52f5131b --- /dev/null +++ b/docs/reference/analysis/token-graphs.asciidoc @@ -0,0 +1,104 @@ +[[token-graphs]] +=== Token graphs + +When a <> converts a text into a stream of +tokens, it also records the following: + +* The `position` of each token in the stream +* The `positionLength`, the number of positions that a token spans + +Using these, you can create a +https://en.wikipedia.org/wiki/Directed_acyclic_graph[directed acyclic graph], +called a _token graph_, for a stream. In a token graph, each position represents +a node. Each token represents an edge or arc, pointing to the next position. + +image::images/analysis/token-graph-qbf-ex.svg[align="center"] + +[[token-graphs-synonyms]] +==== Synonyms + +Some <> can add new tokens, like +synonyms, to an existing token stream. These synonyms often span the same +positions as existing tokens. + +In the following graph, `quick` and its synonym `fast` both have a position of +`0`. They span the same positions. 
+ +image::images/analysis/token-graph-qbf-synonym-ex.svg[align="center"] + +[[token-graphs-multi-position-tokens]] +==== Multi-position tokens + +Some token filters can add tokens that span multiple positions. These can +include tokens for multi-word synonyms, such as using "atm" as a synonym for +"automatic teller machine." + +However, only some token filters, known as _graph token filters_, accurately +record the `positionLength` for multi-position tokens. These filters include: + +* <> +* <> + +In the following graph, `domain name system` and its synonym, `dns`, both have a +position of `0`. However, `dns` has a `positionLength` of `3`. Other tokens in +the graph have a default `positionLength` of `1`. + +image::images/analysis/token-graph-dns-synonym-ex.svg[align="center"] + +[[token-graphs-token-graphs-search]] +===== Using token graphs for search + +<> ignores the `positionLength` attribute +and does not support token graphs containing multi-position tokens. + +However, queries, such as the <> or +<> query, can use these graphs to +generate multiple sub-queries from a single query string. + +.*Example* +[%collapsible] +==== + +A user runs a search for the following phrase using the `match_phrase` query: + +`domain name system is fragile` + +During <>, `dns`, a synonym for +`domain name system`, is added to the query string's token stream. The `dns` +token has a `positionLength` of `3`. + +image::images/analysis/token-graph-dns-synonym-ex.svg[align="center"] + +The `match_phrase` query uses this graph to generate sub-queries for the +following phrases: + +[source,text] +------ +dns is fragile +domain name system is fragile +------ + +This means the query matches documents containing either `dns is fragile` _or_ +`domain name system is fragile`. +==== + +[[token-graphs-invalid-token-graphs]] +===== Invalid token graphs + +The following token filters can add tokens that span multiple positions but +only record a default `positionLength` of `1`: + +* <> +* <> + +This means these filters will produce invalid token graphs for streams +containing such tokens. + +In the following graph, `dns` is a multi-position synonym for `domain name +system`. However, `dns` has the default `positionLength` value of `1`, resulting +in an invalid graph. + +image::images/analysis/token-graph-dns-invalid-ex.svg[align="center"] + +Avoid using invalid token graphs for search. Invalid graphs can cause unexpected +search results. \ No newline at end of file
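A quick way to inspect the `position` and `positionLength` attributes described above is the `_analyze` API with `"explain": true` and an inline `synonym_graph` filter. The request below is a minimal sketch, assuming a hypothetical one-rule synonym list; in the detailed response, the injected `dns` token is reported with a `positionLength` of `3` while the other tokens keep the default of `1`:

[source,console]
--------------------------------------------------
GET /_analyze
{
  "tokenizer": "standard",
  "filter": [
    {
      "type": "synonym_graph",
      "synonyms": [ "dns, domain name system" ]
    }
  ],
  "text": "domain name system is fragile",
  "explain": true
}
--------------------------------------------------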
diff --git a/docs/reference/analysis/tokenfilters/stemmer-tokenfilter.asciidoc b/docs/reference/analysis/tokenfilters/stemmer-tokenfilter.asciidoc index 4e98e24d08ef0..957aad084619f 100644 --- a/docs/reference/analysis/tokenfilters/stemmer-tokenfilter.asciidoc +++ b/docs/reference/analysis/tokenfilters/stemmer-tokenfilter.asciidoc @@ -54,7 +54,6 @@ http://snowball.tartarus.org/algorithms/basque/stemmer.html[*`basque`*] Bengali:: http://www.tandfonline.com/doi/abs/10.1080/02564602.1993.11437284[*`bengali`*] -http://members.unine.ch/jacques.savoy/clef/BengaliStemmerLight.java.txt[*`light_bengali`*] Brazilian Portuguese:: diff --git a/docs/reference/analysis/tokenfilters/synonym-graph-tokenfilter.asciidoc b/docs/reference/analysis/tokenfilters/synonym-graph-tokenfilter.asciidoc index e6bc76e408f23..582ce99b20bf7 100644 --- a/docs/reference/analysis/tokenfilters/synonym-graph-tokenfilter.asciidoc +++ b/docs/reference/analysis/tokenfilters/synonym-graph-tokenfilter.asciidoc @@ -8,8 +8,8 @@ The `synonym_graph` token filter makes it easy to handle synonyms, including multi-word synonyms, correctly during the analysis process. In order to properly handle multi-word synonyms, this token filter -creates a "graph token stream" during processing. For more information -on this topic and its various complexities, please read the +creates a <> during processing. For more +information on this topic and its various complexities, please read the http://blog.mikemccandless.com/2012/04/lucenes-tokenstreams-are-actually.html[Lucene's TokenStreams are actually graphs] blog post. ["NOTE",id="synonym-graph-index-note"] diff --git a/docs/reference/analysis/tokenfilters/uppercase-tokenfilter.asciidoc b/docs/reference/analysis/tokenfilters/uppercase-tokenfilter.asciidoc index 780e09fa951cc..84c9ebd186595 100644 --- a/docs/reference/analysis/tokenfilters/uppercase-tokenfilter.asciidoc +++ b/docs/reference/analysis/tokenfilters/uppercase-tokenfilter.asciidoc @@ -16,7 +16,8 @@ Depending on the language, an uppercase character can map to multiple lowercase characters. Using the `uppercase` filter could result in the loss of lowercase character information. -To avoid this loss but still have a consistent lettercase, use the <> filter instead. +To avoid this loss but still have a consistent letter case, use the +<> filter instead. ==== [[analysis-uppercase-tokenfilter-analyze-ex]] diff --git a/docs/reference/analysis/tokenfilters/word-delimiter-graph-tokenfilter.asciidoc b/docs/reference/analysis/tokenfilters/word-delimiter-graph-tokenfilter.asciidoc index 8581d8cb7ec17..2fa9c41ad79b6 100644 --- a/docs/reference/analysis/tokenfilters/word-delimiter-graph-tokenfilter.asciidoc +++ b/docs/reference/analysis/tokenfilters/word-delimiter-graph-tokenfilter.asciidoc @@ -429,7 +429,7 @@ PUT /my_index [[analysis-word-delimiter-graph-differences]] ==== Differences between `word_delimiter_graph` and `word_delimiter` -Both the `word_delimiter_graph` and +Both the `word_delimiter_graph` and <> filters produce tokens that span multiple positions when any of the following parameters are `true`: @@ -440,8 +440,8 @@ that span multiple positions when any of the following parameters are `true`: However, only the `word_delimiter_graph` filter assigns multi-position tokens a `positionLength` attribute, which indicates the number of positions a token -spans.
This ensures the `word_delimiter_graph` filter always produces valid +<>. The `word_delimiter` filter does not assign multi-position tokens a `positionLength` attribute. This means it produces invalid graphs for streams diff --git a/docs/reference/api-conventions.asciidoc b/docs/reference/api-conventions.asciidoc index 41a09c9a15ff6..3921aab0d1354 100644 --- a/docs/reference/api-conventions.asciidoc +++ b/docs/reference/api-conventions.asciidoc @@ -87,7 +87,7 @@ GET /%3Clogstash-%7Bnow%2Fd%7D%3E/_search } ---------------------------------------------------------------------- // TEST[s/^/PUT logstash-2016.09.20\n/] -// TEST[s/now/2016.09.20||/] +// TEST[s/now/2016.09.20%7C%7C/] [NOTE] .Percent encoding of date math characters @@ -141,7 +141,7 @@ GET /%3Clogstash-%7Bnow%2Fd-2d%7D%3E%2C%3Clogstash-%7Bnow%2Fd-1d%7D%3E%2C%3Clogs } ---------------------------------------------------------------------- // TEST[s/^/PUT logstash-2016.09.20\nPUT logstash-2016.09.19\nPUT logstash-2016.09.18\n/] -// TEST[s/now/2016.09.20||/] +// TEST[s/now/2016.09.20%7C%7C/] [[common-options]] === Common options @@ -367,7 +367,7 @@ GET /_search?filter_path=hits.hits._source&_source=title&sort=rating:desc [float] ==== Flat Settings -The `flat_settings` flag affects rendering of the lists of settings. When the +The `flat_settings` flag affects rendering of the lists of settings. When the `flat_settings` flag is `true`, settings are returned in a flat format: [source,console] diff --git a/docs/reference/async-search.asciidoc b/docs/reference/async-search.asciidoc new file mode 100644 index 0000000000000..d51e017b0f97c --- /dev/null +++ b/docs/reference/async-search.asciidoc @@ -0,0 +1,22 @@ +[role="xpack"] +[testenv="basic"] +[[async-search-intro]] +== Long-running searches + +{es} generally allows you to quickly search across large amounts of data. There are +situations where a search executes on many shards, possibly against +<> and spanning multiple +<>, for which +results are not expected to be returned in milliseconds. When you need to +execute long-running searches, synchronously +waiting for their results to be returned is not ideal. Instead, async search lets +you submit a search request that gets executed _asynchronously_, +monitor the progress of the request, and retrieve results at a later stage. +You can also retrieve partial results as they become available but +before the search has completed. + +You can submit an async search request using the <> API. The <> API allows you to +monitor the progress of an async search request and retrieve its results. An +ongoing async search can be deleted through the <> API.
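As a sketch of that workflow (the index name is illustrative, and `<id>` below stands in for the id returned by the submit call):

[source,console]
--------------------------------------------------
POST /my-index-*/_async_search?wait_for_completion_timeout=1s
{
  "query": { "match_all": {} }
}

GET /_async_search/<id>

DELETE /_async_search/<id>
--------------------------------------------------

If the search finishes within `wait_for_completion_timeout`, the submit call returns the final results directly; otherwise it returns an id that can be polled with the get call and eventually cleaned up with the delete call.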
diff --git a/docs/reference/autoscaling/apis/get-autoscaling-decision.asciidoc b/docs/reference/autoscaling/apis/get-autoscaling-decision.asciidoc index aa66e3a0d034f..dfa14ac180636 100644 --- a/docs/reference/autoscaling/apis/get-autoscaling-decision.asciidoc +++ b/docs/reference/autoscaling/apis/get-autoscaling-decision.asciidoc @@ -47,6 +47,6 @@ The API returns the following result: [source,console-result] -------------------------------------------------- { - + "decisions": [] } -------------------------------------------------- diff --git a/docs/reference/cat.asciidoc b/docs/reference/cat.asciidoc index d557a8c930a20..2b303e84ab955 100644 --- a/docs/reference/cat.asciidoc +++ b/docs/reference/cat.asciidoc @@ -255,16 +255,18 @@ include::cat/recovery.asciidoc[] include::cat/repositories.asciidoc[] -include::cat/tasks.asciidoc[] - -include::cat/thread_pool.asciidoc[] - -include::cat/trainedmodel.asciidoc[] - include::cat/shards.asciidoc[] include::cat/segments.asciidoc[] include::cat/snapshots.asciidoc[] +include::cat/tasks.asciidoc[] + include::cat/templates.asciidoc[] + +include::cat/thread_pool.asciidoc[] + +include::cat/trainedmodel.asciidoc[] + +include::cat/transforms.asciidoc[] diff --git a/docs/reference/cat/nodes.asciidoc b/docs/reference/cat/nodes.asciidoc index d6d67a35ef342..94b004a6f9f45 100644 --- a/docs/reference/cat/nodes.asciidoc +++ b/docs/reference/cat/nodes.asciidoc @@ -44,7 +44,7 @@ Valid columns are: `node.role`, `r`, `role`, `nodeRole`:: (Default) Roles of the node. Returned values include `d` (data node), `i` (ingest node), `m` (master-eligible node), `l` (machine learning node), `v` -(voting-only node), and `-` (coordinating node only). +(voting-only node), `t` ({transform} node), and `-` (coordinating node only). + For example, `dim` indicates a master-eligible data and ingest node. See <>. diff --git a/docs/reference/cat/transforms.asciidoc b/docs/reference/cat/transforms.asciidoc new file mode 100644 index 0000000000000..78013e394ef24 --- /dev/null +++ b/docs/reference/cat/transforms.asciidoc @@ -0,0 +1,28 @@ +[[cat-transforms]] +=== cat {transforms} API +++++ +cat transforms +++++ + +Returns configuration and usage information about {transforms}. + + +[[cat-transforms-api-request]] +==== {api-request-title} + +`GET /_cat/transforms` + + +//[[cat-transforms-api-desc]] +//==== {api-description-title} + + +//[[cat-transforms-api-query-params]] +//==== {api-query-parms-title} + + +//[[cat-transforms-api-response-codes]] +//==== {api-response-codes-title} + +//[[cat-transforms-api-examples]] +//==== {api-examples-title} diff --git a/docs/reference/frozen-indices.asciidoc b/docs/reference/frozen-indices.asciidoc index b6cafad30f5dc..5a2ef125a8cc3 100644 --- a/docs/reference/frozen-indices.asciidoc +++ b/docs/reference/frozen-indices.asciidoc @@ -74,8 +74,8 @@ POST /twitter/_forcemerge?max_num_segments=1 == Searching a frozen index Frozen indices are throttled in order to limit memory consumption per node. The number of concurrently loaded frozen indices per node is -limited by the number of threads in the <> threadpool, which is `1` by default. -Search requests will not be executed against frozen indices by default, even if a frozen index is named explicitly. This is +limited by the number of threads in the <> threadpool, which is `1` by default. +Search requests will not be executed against frozen indices by default, even if a frozen index is named explicitly. This is to prevent accidental slowdowns by targeting a frozen index by mistake.
To include frozen indices a search request must be executed with the query parameter `ignore_throttled=false`. @@ -85,15 +85,6 @@ GET /twitter/_search?q=user:kimchy&ignore_throttled=false -------------------------------------------------- // TEST[setup:twitter] -[IMPORTANT] -================================ -While frozen indices are slow to search, they can be pre-filtered efficiently. The request parameter `pre_filter_shard_size` specifies -a threshold that, when exceeded, will enforce a round-trip to pre-filter search shards that cannot possibly match. -This filter phase can limit the number of shards significantly. For instance, if a date range filter is applied, then all indices (frozen or unfrozen) that do not contain documents within the date range can be skipped efficiently. -The default value for `pre_filter_shard_size` is `128` but it's recommended to set it to `1` when searching frozen indices. There is no -significant overhead associated with this pre-filter phase. -================================ - [role="xpack"] [testenv="basic"] [[monitoring_frozen_indices]] diff --git a/docs/reference/images/analysis/token-graph-dns-ex.svg b/docs/reference/images/analysis/token-graph-dns-ex.svg new file mode 100644 index 0000000000000..0eda4fa54bd20 --- /dev/null +++ b/docs/reference/images/analysis/token-graph-dns-ex.svg @@ -0,0 +1,65 @@ [SVG markup omitted: "dns" token graph illustration] diff --git a/docs/reference/images/analysis/token-graph-dns-invalid-ex.svg b/docs/reference/images/analysis/token-graph-dns-invalid-ex.svg new file mode 100644 index 0000000000000..5614f39bfe35c --- /dev/null +++ b/docs/reference/images/analysis/token-graph-dns-invalid-ex.svg @@ -0,0 +1,72 @@ [SVG markup omitted: invalid "dns" token graph illustration] diff --git a/docs/reference/images/analysis/token-graph-dns-synonym-ex.svg b/docs/reference/images/analysis/token-graph-dns-synonym-ex.svg new file mode 100644 index 0000000000000..cff5b1306b73b --- /dev/null +++ b/docs/reference/images/analysis/token-graph-dns-synonym-ex.svg @@ -0,0 +1,72 @@ [SVG markup omitted: "dns" synonym token graph illustration] diff --git a/docs/reference/images/analysis/token-graph-qbf-ex.svg b/docs/reference/images/analysis/token-graph-qbf-ex.svg new file mode 100644 index 0000000000000..63970673092d4 --- /dev/null +++ b/docs/reference/images/analysis/token-graph-qbf-ex.svg @@ -0,0 +1,45 @@ [SVG markup omitted: "quick brown fox" token graph illustration] diff --git a/docs/reference/images/analysis/token-graph-qbf-synonym-ex.svg b/docs/reference/images/analysis/token-graph-qbf-synonym-ex.svg new file mode 100644 index 0000000000000..2baa3d9e63cb5 --- /dev/null +++ b/docs/reference/images/analysis/token-graph-qbf-synonym-ex.svg @@ -0,0 +1,52 @@ [SVG markup omitted: "quick brown fox" synonym token graph illustration]
\ No newline at end of file diff --git a/docs/reference/index-modules/history-retention.asciidoc b/docs/reference/index-modules/history-retention.asciidoc index 6ace77c3533ff..fb4aa26ab9b07 100644 --- a/docs/reference/index-modules/history-retention.asciidoc +++ b/docs/reference/index-modules/history-retention.asciidoc @@ -54,12 +54,11 @@ reasonable recovery scenarios. `index.soft_deletes.enabled`:: + deprecated:[7.6.0, Creating indices with soft-deletes disabled is deprecated and will be removed in future Elasticsearch versions.] Whether or not soft deletes are enabled on the index. Soft deletes can only be configured at index creation and only on indices created on or after 6.5.0. The default value is `true`. - deprecated::[7.6, Creating indices with soft-deletes disabled is - deprecated and will be removed in future Elasticsearch versions.] `index.soft_deletes.retention_lease.period`:: diff --git a/docs/reference/index.asciidoc b/docs/reference/index.asciidoc index fcfc85d7184c5..bce1d615f264d 100644 --- a/docs/reference/index.asciidoc +++ b/docs/reference/index.asciidoc @@ -26,6 +26,8 @@ include::query-dsl.asciidoc[] include::modules/cross-cluster-search.asciidoc[] +include::async-search.asciidoc[] + include::scripting.asciidoc[] include::mapping.asciidoc[] diff --git a/docs/reference/indices/rollover-index.asciidoc b/docs/reference/indices/rollover-index.asciidoc index b75961a52559e..42c4d9a8a5b44 100644 --- a/docs/reference/indices/rollover-index.asciidoc +++ b/docs/reference/indices/rollover-index.asciidoc @@ -300,7 +300,7 @@ POST /logs_write/_rollover <2> } } -------------------------------------------------- -// TEST[s/now/2016.10.31||/] +// TEST[s/now/2016.10.31%7C%7C/] <1> Creates an index named with today's date (e.g.) `logs-2016.10.31-1` <2> Rolls over to a new index with today's date, e.g. `logs-2016.10.31-000002` if run immediately, or `logs-2016.11.01-000002` if run after 24 hours @@ -339,7 +339,7 @@ over indices created in the last three days, you could do the following: GET /%3Clogs-%7Bnow%2Fd%7D-*%3E%2C%3Clogs-%7Bnow%2Fd-1d%7D-*%3E%2C%3Clogs-%7Bnow%2Fd-2d%7D-*%3E/_search -------------------------------------------------- // TEST[continued] -// TEST[s/now/2016.10.31||/] +// TEST[s/now/2016.10.31%7C%7C/] [[rollover-index-api-dry-run-ex]] diff --git a/docs/reference/indices/templates.asciidoc b/docs/reference/indices/templates.asciidoc index 995efe28eea74..eeea74b5a544a 100644 --- a/docs/reference/indices/templates.asciidoc +++ b/docs/reference/indices/templates.asciidoc @@ -93,8 +93,6 @@ Name of the index template to create. If `true`, this request cannot replace or update existing index templates. Defaults to `false`. -include::{docdir}/rest-api/common-parms.asciidoc[tag=flat-settings] `order`:: (Optional, integer) Order in which {es} applies this template Templates with lower `order` values are merged first. Templates with higher `order` values are merged later, overriding templates with lower values.
-include::{docdir}/rest-api/common-parms.asciidoc[tag=timeoutparms] +include::{docdir}/rest-api/common-parms.asciidoc[tag=master-timeout] [[put-index-template-api-request-body]] diff --git a/docs/reference/ingest/processors/bytes.asciidoc b/docs/reference/ingest/processors/bytes.asciidoc index 76f054cac64c2..5a551f8a82eac 100644 --- a/docs/reference/ingest/processors/bytes.asciidoc +++ b/docs/reference/ingest/processors/bytes.asciidoc @@ -1,6 +1,6 @@ [[bytes-processor]] === Bytes Processor -Converts a human readable byte value (e.g. 1kb) to its value in bytes (e.g. 1024). +Converts a human readable byte value (e.g. 1kb) to its value in bytes (e.g. 1024). If the field is an array of strings, all members of the array will be converted. Supported human readable units are "b", "kb", "mb", "gb", "tb", "pb" case insensitive. An error will occur if the field is not a supported format or resultant value exceeds 2^63. diff --git a/docs/reference/ingest/processors/gsub.asciidoc b/docs/reference/ingest/processors/gsub.asciidoc index f6919eb1e95f1..2defa6e7cd138 100644 --- a/docs/reference/ingest/processors/gsub.asciidoc +++ b/docs/reference/ingest/processors/gsub.asciidoc @@ -1,7 +1,7 @@ [[gsub-processor]] === Gsub Processor Converts a string field by applying a regular expression and a replacement. -If the field is not a string, the processor will throw an exception. +If the field is an array of strings, all members of the array will be converted. If any non-string values are encountered, the processor will throw an exception. [[gsub-options]] .Gsub Options diff --git a/docs/reference/ingest/processors/html_strip.asciidoc b/docs/reference/ingest/processors/html_strip.asciidoc index 2fa3cd7bbb8ae..bd4e8e8ccd920 100644 --- a/docs/reference/ingest/processors/html_strip.asciidoc +++ b/docs/reference/ingest/processors/html_strip.asciidoc @@ -1,6 +1,6 @@ [[htmlstrip-processor]] === HTML Strip Processor -Removes HTML from field. +Removes HTML tags from the field. If the field is an array of strings, HTML tags will be removed from all members of the array. NOTE: Each HTML tag is replaced with a `\n` character. diff --git a/docs/reference/ingest/processors/lowercase.asciidoc b/docs/reference/ingest/processors/lowercase.asciidoc index 878b74ed9ba24..903d69625352f 100644 --- a/docs/reference/ingest/processors/lowercase.asciidoc +++ b/docs/reference/ingest/processors/lowercase.asciidoc @@ -1,6 +1,6 @@ [[lowercase-processor]] === Lowercase Processor -Converts a string to its lowercase equivalent. +Converts a string to its lowercase equivalent. If the field is an array of strings, all members of the array will be converted. [[lowercase-options]] .Lowercase Options diff --git a/docs/reference/ingest/processors/trim.asciidoc b/docs/reference/ingest/processors/trim.asciidoc index 7c28767076ecc..ef3611161e2e2 100644 --- a/docs/reference/ingest/processors/trim.asciidoc +++ b/docs/reference/ingest/processors/trim.asciidoc @@ -1,6 +1,6 @@ [[trim-processor]] === Trim Processor -Trims whitespace from field. +Trims whitespace from field. If the field is an array of strings, all members of the array will be trimmed. NOTE: This only works on leading and trailing whitespace.
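Since several of these processors now accept arrays of strings, the behavior is easy to verify with the simulate pipeline API. A minimal sketch, with an invented `tags` field and sample values:

[source,console]
--------------------------------------------------
POST /_ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "trim": { "field": "tags" } },
      { "lowercase": { "field": "tags" } }
    ]
  },
  "docs": [
    { "_source": { "tags": ["  Alpha ", " BETA  "] } }
  ]
}
--------------------------------------------------

Each member of `tags` comes back trimmed and lowercased: `["alpha", "beta"]`.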
diff --git a/docs/reference/ingest/processors/uppercase.asciidoc b/docs/reference/ingest/processors/uppercase.asciidoc index 7565be1c7c303..3e26cedcf9cce 100644 --- a/docs/reference/ingest/processors/uppercase.asciidoc +++ b/docs/reference/ingest/processors/uppercase.asciidoc @@ -1,6 +1,6 @@ [[uppercase-processor]] === Uppercase Processor -Converts a string to its uppercase equivalent. +Converts a string to its uppercase equivalent. If the field is an array of strings, all members of the array will be converted. [[uppercase-options]] .Uppercase Options diff --git a/docs/reference/ingest/processors/url-decode.asciidoc b/docs/reference/ingest/processors/url-decode.asciidoc index 76fc00c80f679..268fce1c18c2a 100644 --- a/docs/reference/ingest/processors/url-decode.asciidoc +++ b/docs/reference/ingest/processors/url-decode.asciidoc @@ -1,6 +1,6 @@ [[urldecode-processor]] === URL Decode Processor -URL-decodes a string +URL-decodes a string. If the field is an array of strings, all members of the array will be decoded. [[urldecode-options]] .URL Decode Options diff --git a/docs/reference/mapping.asciidoc b/docs/reference/mapping.asciidoc index 3f89571ae5808..96153f5e2a355 100644 --- a/docs/reference/mapping.asciidoc +++ b/docs/reference/mapping.asciidoc @@ -13,29 +13,22 @@ are stored and indexed. For instance, use mappings to define: * custom rules to control the mapping for <>. -[float] -[[mapping-type]] -== Mapping Type - -Each index has one _mapping type_ which determines how the document will be -indexed. - -deprecated::[6.0.0,See <>] - -A mapping type has: +A mapping definition has: <>:: Meta-fields are used to customize how a document's associated metadata is treated. Examples of meta-fields include the document's -<>, <>, -<>, and <> fields. +<>, <>, and +<> fields. <> or _properties_:: -A mapping type contains a list of fields or `properties` pertinent to the +A mapping contains a list of fields or `properties` pertinent to the document. +NOTE: Before 7.0.0, the 'mappings' definition used to include a type name. +For more details, please see <>. [float] [[field-datatypes]] diff --git a/docs/reference/mapping/fields.asciidoc b/docs/reference/mapping/fields.asciidoc index f6d5f00a9b5e0..0ea4b77441c42 100644 --- a/docs/reference/mapping/fields.asciidoc +++ b/docs/reference/mapping/fields.asciidoc @@ -15,7 +15,7 @@ can be customised when a mapping type is created. <>:: - The document's <>. + The document's mapping type. <>:: diff --git a/docs/reference/mapping/fields/routing-field.asciidoc b/docs/reference/mapping/fields/routing-field.asciidoc index 25c3571f08e8d..46a204ccddfde 100644 --- a/docs/reference/mapping/fields/routing-field.asciidoc +++ b/docs/reference/mapping/fields/routing-field.asciidoc @@ -92,7 +92,7 @@ PUT my_index2/_doc/1 <2> ------------------------------ // TEST[catch:bad_request] -<1> Routing is required for `_doc` documents. +<1> Routing is required for all documents. <2> This index request throws a `routing_missing_exception`. ==== Unique IDs with custom routing @@ -128,4 +128,4 @@ less than `index.number_of_shards`. Once enabled, the partitioned index will have the following limitations: * Mappings with <> relationships cannot be created within it. -* All mappings within the index must have the `_routing` field marked as required. \ No newline at end of file +* All mappings within the index must have the `_routing` field marked as required.
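To illustrate the requirement (continuing the `my_index2` example above, with a hypothetical routing value), once `_routing` is marked as required every index and get request must carry an explicit `routing` parameter:

[source,console]
--------------------------------------------------
PUT my_index2/_doc/1?routing=user1
{
  "title": "This is a document"
}

GET my_index2/_doc/1?routing=user1
--------------------------------------------------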
diff --git a/docs/reference/mapping/fields/type-field.asciidoc b/docs/reference/mapping/fields/type-field.asciidoc index 2c5dc7195d643..4e6126cde9c9d 100644 --- a/docs/reference/mapping/fields/type-field.asciidoc +++ b/docs/reference/mapping/fields/type-field.asciidoc @@ -3,9 +3,9 @@ deprecated[6.0.0,See <>] -Each document indexed is associated with a <> (see -<>) and an <>. The `_type` field is -indexed in order to make searching by type name fast. +Each document indexed is associated with a <> and +an <>. The `_type` field is indexed in order to make +searching by type name fast. The value of the `_type` field is accessible in queries, aggregations, scripts, and when sorting: @@ -57,7 +57,6 @@ GET my_index/_search -------------------------- // TEST[warning:[types removal] Using the _type field in queries and aggregations is deprecated, prefer to use a field instead.] -// TEST[warning:[types removal] Looking up doc types [_type] in scripts is deprecated.] <1> Querying on the `_type` field <2> Aggregating on the `_type` field diff --git a/docs/reference/mapping/params/analyzer.asciidoc b/docs/reference/mapping/params/analyzer.asciidoc index 02f10363c44ea..47f0a5d68ec34 100644 --- a/docs/reference/mapping/params/analyzer.asciidoc +++ b/docs/reference/mapping/params/analyzer.asciidoc @@ -1,81 +1,23 @@ [[analyzer]] === `analyzer` -The values of <> fields are passed through an -<> to convert the string into a stream of _tokens_ or -_terms_. For instance, the string `"The quick Brown Foxes."` may, depending -on which analyzer is used, be analyzed to the tokens: `quick`, `brown`, -`fox`. These are the actual terms that are indexed for the field, which makes -it possible to search efficiently for individual words _within_ big blobs of -text. - -This analysis process needs to happen not just at index time, but also at -query time: the query string needs to be passed through the same (or a -similar) analyzer so that the terms that it tries to find are in the same -format as those that exist in the index. - -Elasticsearch ships with a number of <>, -which can be used without further configuration. It also ships with many -<>, <>, -and <> which can be combined to configure -custom analyzers per index. - -Analyzers can be specified per-query, per-field or per-index. At index time, -Elasticsearch will look for an analyzer in this order: - -* The `analyzer` defined in the field mapping. -* An analyzer named `default` in the index settings. -* The <> analyzer. - -At query time, there are a few more layers: - -* The `analyzer` defined in a <>. -* The `search_analyzer` defined in the field mapping. -* The `analyzer` defined in the field mapping. -* An analyzer named `default_search` in the index settings. -* An analyzer named `default` in the index settings. -* The <> analyzer. - -The easiest way to specify an analyzer for a particular field is to define it -in the field mapping, as follows: - -[source,console] --------------------------------------------------- -PUT /my_index -{ - "mappings": { - "properties": { - "text": { <1> - "type": "text", - "fields": { - "english": { <2> - "type": "text", - "analyzer": "english" - } - } - } - } - } -} - -GET my_index/_analyze <3> -{ - "field": "text", - "text": "The quick Brown Foxes." -} - -GET my_index/_analyze <4> -{ - "field": "text.english", - "text": "The quick Brown Foxes." -} --------------------------------------------------- - -<1> The `text` field uses the default `standard` analyzer`. 
-<2> The `text.english` <> uses the `english` analyzer, which removes stop words and applies stemming. -<3> This returns the tokens: [ `the`, `quick`, `brown`, `foxes` ]. -<4> This returns the tokens: [ `quick`, `brown`, `fox` ]. - +[IMPORTANT] +==== +Only <> fields support the `analyzer` mapping parameter. +==== + +The `analyzer` parameter specifies the <> used for +<> when indexing or searching a `text` field. + +Unless overridden with the <> mapping +parameter, this analyzer is used for both <>. See <>. + +[TIP] +==== +We recommend testing analyzers before using them in production. See +<>. +==== [[search-quote-analyzer]] ==== `search_quote_analyzer` diff --git a/docs/reference/mapping/removal_of_types.asciidoc b/docs/reference/mapping/removal_of_types.asciidoc index ec101080e1b31..ca028355de6f7 100644 --- a/docs/reference/mapping/removal_of_types.asciidoc +++ b/docs/reference/mapping/removal_of_types.asciidoc @@ -1,674 +1,6 @@ [[removal-of-types]] == Removal of mapping types -IMPORTANT: Indices created in Elasticsearch 7.0.0 or later no longer accept a -`_default_` mapping. Indices created in 6.x will continue to function as before -in Elasticsearch 6.x. Types are deprecated in APIs in 7.0, with breaking changes -to the index creation, put mapping, get mapping, put template, get template and -get field mappings APIs. - -[float] -=== What are mapping types? - -Since the first release of Elasticsearch, each document has been stored in a -single index and assigned a single mapping type. A mapping type was used to -represent the type of document or entity being indexed, for instance a -`twitter` index might have a `user` type and a `tweet` type. - -Each mapping type could have its own fields, so the `user` type might have a -`full_name` field, a `user_name` field, and an `email` field, while the -`tweet` type could have a `content` field, a `tweeted_at` field and, like the -`user` type, a `user_name` field. - -Each document had a `_type` meta-field containing the type name, and searches -could be limited to one or more types by specifying the type name(s) in the -URL: - -[source,js] ----- -GET twitter/user,tweet/_search -{ - "query": { - "match": { - "user_name": "kimchy" - } - } -} ----- -// NOTCONSOLE - -The `_type` field was combined with the document's `_id` to generate a `_uid` -field, so documents of different types with the same `_id` could exist in a -single index. - -Mapping types were also used to establish a -<> -between documents, so documents of type `question` could be parents to -documents of type `answer`. - -[float] -=== Why are mapping types being removed? - -Initially, we spoke about an ``index'' being similar to a ``database'' in an -SQL database, and a ``type'' being equivalent to a -``table''. - -This was a bad analogy that led to incorrect assumptions. In an SQL database, -tables are independent of each other. The columns in one table have no -bearing on columns with the same name in another table. This is not the case -for fields in a mapping type. - -In an Elasticsearch index, fields that have the same name in different mapping -types are backed by the same Lucene field internally. In other words, using -the example above, the `user_name` field in the `user` type is stored in -exactly the same field as the `user_name` field in the `tweet` type, and both -`user_name` fields must have the same mapping (definition) in both types. 
- -This can lead to frustration when, for example, you want `deleted` to be a -`date` field in one type and a `boolean` field in another type in the same -index. - -On top of that, storing different entities that have few or no fields in -common in the same index leads to sparse data and interferes with Lucene's -ability to compress documents efficiently. - -For these reasons, we have decided to remove the concept of mapping types from -Elasticsearch. - -[float] -=== Alternatives to mapping types - -[float] -==== Index per document type - -The first alternative is to have an index per document type. Instead of -storing tweets and users in a single `twitter` index, you could store tweets -in the `tweets` index and users in the `user` index. Indices are completely -independent of each other and so there will be no conflict of field types -between indices. - -This approach has two benefits: - -* Data is more likely to be dense and so benefit from compression techniques - used in Lucene. - -* The term statistics used for scoring in full text search are more likely to - be accurate because all documents in the same index represent a single - entity. - -Each index can be sized appropriately for the number of documents it will -contain: you can use a smaller number of primary shards for `users` and a -larger number of primary shards for `tweets`. - -[float] -==== Custom type field - -Of course, there is a limit to how many primary shards can exist in a cluster -so you may not want to waste an entire shard for a collection of only a few -thousand documents. In this case, you can implement your own custom `type` -field which will work in a similar way to the old `_type`. - -Let's take the `user`/`tweet` example above. Originally, the workflow would -have looked something like this: - -[source,js] ----- -PUT twitter -{ - "mappings": { - "user": { - "properties": { - "name": { "type": "text" }, - "user_name": { "type": "keyword" }, - "email": { "type": "keyword" } - } - }, - "tweet": { - "properties": { - "content": { "type": "text" }, - "user_name": { "type": "keyword" }, - "tweeted_at": { "type": "date" } - } - } - } -} - -PUT twitter/user/kimchy -{ - "name": "Shay Banon", - "user_name": "kimchy", - "email": "shay@kimchy.com" -} - -PUT twitter/tweet/1 -{ - "user_name": "kimchy", - "tweeted_at": "2017-10-24T09:00:00Z", - "content": "Types are going away" -} - -GET twitter/tweet/_search -{ - "query": { - "match": { - "user_name": "kimchy" - } - } -} ----- -// NOTCONSOLE - -You can achieve the same thing by adding a custom `type` field as follows: - -[source,js] ----- -PUT twitter -{ - "mappings": { - "_doc": { - "properties": { - "type": { "type": "keyword" }, <1> - "name": { "type": "text" }, - "user_name": { "type": "keyword" }, - "email": { "type": "keyword" }, - "content": { "type": "text" }, - "tweeted_at": { "type": "date" } - } - } - } -} - -PUT twitter/_doc/user-kimchy -{ - "type": "user", <1> - "name": "Shay Banon", - "user_name": "kimchy", - "email": "shay@kimchy.com" -} - -PUT twitter/_doc/tweet-1 -{ - "type": "tweet", <1> - "user_name": "kimchy", - "tweeted_at": "2017-10-24T09:00:00Z", - "content": "Types are going away" -} - -GET twitter/_search -{ - "query": { - "bool": { - "must": { - "match": { - "user_name": "kimchy" - } - }, - "filter": { - "match": { - "type": "tweet" <1> - } - } - } - } -} ----- -// NOTCONSOLE -<1> The explicit `type` field takes the place of the implicit `_type` field. 
- -[float] -[[parent-child-mapping-types]] -==== Parent/Child without mapping types - -Previously, a parent-child relationship was represented by making one mapping -type the parent, and one or more other mapping types the children. Without -types, we can no longer use this syntax. The parent-child feature will -continue to function as before, except that the way of expressing the -relationship between documents has been changed to use the new -<>. - - -[float] -=== Schedule for removal of mapping types - -This is a big change for our users, so we have tried to make it as painless as -possible. The change will roll out as follows: - -Elasticsearch 5.6.0:: - -* Setting `index.mapping.single_type: true` on an index will enable the - single-type-per-index behaviour which will be enforced in 6.0. - -* The <> replacement for parent-child is available - on indices created in 5.6. - -Elasticsearch 6.x:: - -* Indices created in 5.x will continue to function in 6.x as they did in 5.x. - -* Indices created in 6.x only allow a single-type per index. Any name - can be used for the type, but there can be only one. The preferred type name - is `_doc`, so that index APIs have the same path as they will have in 7.0: - `PUT {index}/_doc/{id}` and `POST {index}/_doc` - -* The `_type` name can no longer be combined with the `_id` to form the `_uid` - field. The `_uid` field has become an alias for the `_id` field. - -* New indices no longer support the old-style of parent/child and should - use the <> instead. - -* The `_default_` mapping type is deprecated. - -* In 6.8, the index creation, index template, and mapping APIs support a query - string parameter (`include_type_name`) which indicates whether requests and - responses should include a type name. It defaults to `true`, and should be set - to an explicit value to prepare to upgrade to 7.0. Not setting `include_type_name` - will result in a deprecation warning. Indices which don't have an explicit type will - use the dummy type name `_doc`. - -Elasticsearch 7.x:: - -* Specifying types in requests is deprecated. For instance, indexing a - document no longer requires a document `type`. The new index APIs - are `PUT {index}/_doc/{id}` in case of explicit ids and `POST {index}/_doc` - for auto-generated ids. Note that in 7.0, `_doc` is a permanent part of the - path, and represents the endpoint name rather than the document type. - -* The `include_type_name` parameter in the index creation, index template, - and mapping APIs will default to `false`. Setting the parameter at all will - result in a deprecation warning. - -* The `_default_` mapping type is removed. - -Elasticsearch 8.x:: - -* Specifying types in requests is no longer supported. - -* The `include_type_name` parameter is removed. - -[float] -=== Migrating multi-type indices to single-type - -The <> can be used to convert multi-type indices to -single-type indices. The following examples can be used in Elasticsearch 5.6 -or Elasticsearch 6.x. In 6.x, there is no need to specify -`index.mapping.single_type` as that is the default. 
- -[float] -==== Index per document type - -This first example splits our `twitter` index into a `tweets` index and a -`users` index: - -[source,js] ----- -PUT users -{ - "settings": { - "index.mapping.single_type": true - }, - "mappings": { - "_doc": { - "properties": { - "name": { - "type": "text" - }, - "user_name": { - "type": "keyword" - }, - "email": { - "type": "keyword" - } - } - } - } -} - -PUT tweets -{ - "settings": { - "index.mapping.single_type": true - }, - "mappings": { - "_doc": { - "properties": { - "content": { - "type": "text" - }, - "user_name": { - "type": "keyword" - }, - "tweeted_at": { - "type": "date" - } - } - } - } -} - -POST _reindex -{ - "source": { - "index": "twitter", - "type": "user" - }, - "dest": { - "index": "users", - "type": "_doc" - } -} - -POST _reindex -{ - "source": { - "index": "twitter", - "type": "tweet" - }, - "dest": { - "index": "tweets", - "type": "_doc" - } -} ----- -// NOTCONSOLE - -[float] -==== Custom type field - -This next example adds a custom `type` field and sets it to the value of the -original `_type`. It also adds the type to the `_id` in case there are any -documents of different types which have conflicting IDs: - -[source,js] ----- -PUT new_twitter -{ - "mappings": { - "_doc": { - "properties": { - "type": { - "type": "keyword" - }, - "name": { - "type": "text" - }, - "user_name": { - "type": "keyword" - }, - "email": { - "type": "keyword" - }, - "content": { - "type": "text" - }, - "tweeted_at": { - "type": "date" - } - } - } - } -} - - -POST _reindex -{ - "source": { - "index": "twitter" - }, - "dest": { - "index": "new_twitter" - }, - "script": { - "source": """ - ctx._source.type = ctx._type; - ctx._id = ctx._type + '-' + ctx._id; - ctx._type = '_doc'; - """ - } -} ----- -// NOTCONSOLE - -[float] -=== Typeless APIs in 7.0 - -In Elasticsearch 7.0, each API will support typeless requests, -and specifying a type will produce a deprecation warning. - -NOTE: Typeless APIs work even if the target index contains a custom type. -For example, if an index has the custom type name `my_type`, we can add -documents to it using typeless `index` calls, and load documents with typeless -`get` calls. - -[float] -==== Index APIs - -Index creation, index template, and mapping APIs support a new `include_type_name` -URL parameter that specifies whether mapping definitions in requests and responses -should contain the type name. The parameter defaults to `true` in version 6.8 to -match the pre-7.0 behavior of using type names in mappings. It defaults to `false` -in version 7.0 and will be removed in version 8.0. - -It should be set explicitly in 6.8 to prepare to upgrade to 7.0. To avoid deprecation -warnings in 6.8, the parameter can be set to either `true` or `false`. In 7.0, setting -`include_type_name` at all will result in a deprecation warning. - -See some examples of interactions with Elasticsearch with this option set to `false`: - -[source,console] --------------------------------------------------- -PUT index -{ - "mappings": { - "properties": { <1> - "foo": { - "type": "keyword" - } - } - } -} --------------------------------------------------- - -<1> Mappings are included directly under the `mappings` key, without a type name. 
- -[source,console] --------------------------------------------------- -PUT index/_mappings -{ - "properties": { <1> - "bar": { - "type": "text" - } - } -} --------------------------------------------------- -// TEST[continued] - -<1> Mappings are included directly under the `mappings` key, without a type name. - -[source,console] --------------------------------------------------- -GET index/_mappings --------------------------------------------------- -// TEST[continued] - -The above call returns - -[source,console-result] --------------------------------------------------- -{ - "index": { - "mappings": { - "properties": { <1> - "foo": { - "type": "keyword" - }, - "bar": { - "type": "text" - } - } - } - } -} --------------------------------------------------- - -<1> Mappings are included directly under the `mappings` key, without a type name. - -[float] -==== Document APIs - -In 7.0, index APIs must be called with the `{index}/_doc` path for automatic -generation of the `_id` and `{index}/_doc/{id}` with explicit ids. - -[source,console] --------------------------------------------------- -PUT index/_doc/1 -{ - "foo": "baz" -} --------------------------------------------------- - -[source,console-result] --------------------------------------------------- -{ - "_index": "index", - "_id": "1", - "_version": 1, - "result": "created", - "_shards": { - "total": 2, - "successful": 1, - "failed": 0 - }, - "_seq_no": 0, - "_primary_term": 1 -} --------------------------------------------------- - -Similarly, the `get` and `delete` APIs use the path `{index}/_doc/{id}`: - -[source,console] --------------------------------------------------- -GET index/_doc/1 --------------------------------------------------- -// TEST[continued] - -NOTE: In 7.0, `_doc` represents the endpoint name instead of the document type. -The `_doc` component is a permanent part of the path for the document `index`, -`get`, and `delete` APIs going forward, and will not be removed in 8.0. - -For API paths that contain both a type and endpoint name like `_update`, -in 7.0 the endpoint will immediately follow the index name: - -[source,console] --------------------------------------------------- -POST index/_update/1 -{ - "doc" : { - "foo" : "qux" - } -} - -GET /index/_source/1 --------------------------------------------------- -// TEST[continued] - -Types should also no longer appear in the body of requests. The following -example of bulk indexing omits the type both in the URL, and in the individual -bulk commands: - -[source,console] --------------------------------------------------- -POST _bulk -{ "index" : { "_index" : "index", "_id" : "3" } } -{ "foo" : "baz" } -{ "index" : { "_index" : "index", "_id" : "4" } } -{ "foo" : "qux" } --------------------------------------------------- - -[float] -==== Search APIs - -When calling a search API such `_search`, `_msearch`, or `_explain`, types -should not be included in the URL. Additionally, the `_type` field should not -be used in queries, aggregations, or scripts. - -[float] -==== Index templates - -It is recommended to make index templates typeless by re-adding them with -`include_type_name` set to `false`. Under the hood, typeless templates will use -the dummy type `_doc` when creating indices. - -In case typeless templates are used with typed index creation calls or typed -templates are used with typeless index creation calls, the template will still -be applied but the index creation call decides whether there should be a type -or not. 
For instance in the below example, `index-1-01` will have a type in -spite of the fact that it matches a template that is typeless, and `index-2-01` -will be typeless in spite of the fact that it matches a template that defines -a type. Both `index-1-01` and `index-2-01` will inherit the `foo` field from -the template that they match. - -[source,console] --------------------------------------------------- -PUT _template/template1 -{ - "index_patterns":[ "index-1-*" ], - "mappings": { - "properties": { - "foo": { - "type": "keyword" - } - } - } -} - -PUT index-1-01 -{ - "mappings": { - "properties": { - "bar": { - "type": "long" - } - } - } -} - -PUT index-2-01 -{ - "mappings": { - "properties": { - "bar": { - "type": "long" - } - } - } -} --------------------------------------------------- - -////////////////////////// - -[source,console] --------------------------------------------------- -DELETE /_template/template1 --------------------------------------------------- -// TEST[continued] - -////////////////////////// - -In case of implicit index creation, because of documents that get indexed in -an index that doesn't exist yet, the template is always honored. This is -usually not a problem due to the fact that typeless index calls work on typed -indices. - -[float] -==== Mixed-version clusters - -In a cluster composed of both 6.8 and 7.0 nodes, the parameter -`include_type_name` should be specified in index APIs like index -creation. This is because the parameter has a different default between -6.8 and 7.0, so the same mapping definition will not be valid for both -node versions. - -Typeless document APIs such as `bulk` and `update` are only available as of -7.0, and will not work with 6.8 nodes. This also holds true for the typeless -versions of queries that perform document lookups, such as `terms`. +Elasticsearch 8.0.0 no longer supports mapping types. For details on how to +migrate your clusters away from mapping types, see the +{ref-7x}/removal-of-types.html[removal of types] documentation for the 7.x release diff --git a/docs/reference/mapping/types/date_nanos.asciidoc b/docs/reference/mapping/types/date_nanos.asciidoc index aadae3238cdf9..c5dc9481d3191 100644 --- a/docs/reference/mapping/types/date_nanos.asciidoc +++ b/docs/reference/mapping/types/date_nanos.asciidoc @@ -94,6 +94,6 @@ same mapping parameters than with the `date` field can be used. [[date-nanos-limitations]] ==== Limitations -Aggregations are still on millisecond resolution, even when using a -`date_nanos` field. +Aggregations are still on millisecond resolution, even when using a `date_nanos` +field. This limitation also affects <>. diff --git a/docs/reference/mapping/types/wildcard.asciidoc b/docs/reference/mapping/types/wildcard.asciidoc index 51d10ff53ca92..aa2de7db87afc 100644 --- a/docs/reference/mapping/types/wildcard.asciidoc +++ b/docs/reference/mapping/types/wildcard.asciidoc @@ -8,6 +8,7 @@ A `wildcard` field stores values optimised for wildcard grep-like queries. Wildcard queries are possible on other field types but suffer from constraints: + * `text` fields limit matching of any wildcard expressions to individual tokens rather than the original whole value held in a field * `keyword` fields are untokenized but slow at performing wildcard queries (especially patterns with leading wildcards). 
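As a rough sketch of what the `wildcard` field enables, the following creates such a field, indexes a value, and runs the kind of leading-wildcard pattern that would be slow on a `keyword` field (the index and field names here are illustrative, not part of this change):

[source,console]
--------------------------------------------------
PUT my_index
{
  "mappings": {
    "properties": {
      "my_wildcard": {
        "type": "wildcard"    <1>
      }
    }
  }
}

POST my_index/_doc/1?refresh
{
  "my_wildcard": "This string can be quite lengthy"
}

GET my_index/_search
{
  "query": {
    "wildcard": {
      "my_wildcard": {
        "value": "*quite*lengthy"    <2>
      }
    }
  }
}
--------------------------------------------------
<1> The field stores values optimised for grep-like pattern matching on the whole original value.
<2> A pattern with a leading wildcard, matching anywhere in the value rather than within individual tokens.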
@@ -47,6 +48,23 @@ POST my_index/_doc/_search -------------------------------------------------- +[[wildcard-params]] +==== Parameters for wildcard fields + +The following parameters are accepted by `wildcard` fields: + +[horizontal] + +<>:: + + Do not index any string longer than this value. Defaults to `2147483647` + so that all values would be accepted. + +<>:: + + How to pre-process the value prior to indexing. Defaults to `null`, + meaning the value is kept as-is. + ==== Limitations * `wildcard` fields are untokenized like keyword fields, so do not support queries that rely on word positions such as phrase queries. diff --git a/docs/reference/ml/anomaly-detection/apis/flush-job.asciidoc b/docs/reference/ml/anomaly-detection/apis/flush-job.asciidoc index 510e5b9a7b863..9bac3347dccaa 100644 --- a/docs/reference/ml/anomaly-detection/apis/flush-job.asciidoc +++ b/docs/reference/ml/anomaly-detection/apis/flush-job.asciidoc @@ -98,7 +98,7 @@ on January 1, 2018: -------------------------------------------------- POST _ml/anomaly_detectors/total-requests/_flush { - "advance_time": "1514804400" + "advance_time": "1514804400000" } -------------------------------------------------- // TEST[skip:setup:server_metrics_openjob] diff --git a/docs/reference/ml/anomaly-detection/apis/start-datafeed.asciidoc b/docs/reference/ml/anomaly-detection/apis/start-datafeed.asciidoc index 2dffa5e62481e..452944ef1e828 100644 --- a/docs/reference/ml/anomaly-detection/apis/start-datafeed.asciidoc +++ b/docs/reference/ml/anomaly-detection/apis/start-datafeed.asciidoc @@ -49,7 +49,7 @@ following formats: + - ISO 8601 format with milliseconds, for example `2017-01-22T06:00:00.000Z` - ISO 8601 format without milliseconds, for example `2017-01-22T06:00:00+00:00` -- Seconds from the Epoch, for example `1390370400` +- Milliseconds since the epoch, for example `1485061200000` Date-time arguments using either of the ISO 8601 formats must have a time zone designator, where Z is accepted as an abbreviation for UTC time. diff --git a/docs/reference/ml/ml-shared.asciidoc b/docs/reference/ml/ml-shared.asciidoc index 16c9f486d8d63..75dc0a946d636 100644 --- a/docs/reference/ml/ml-shared.asciidoc +++ b/docs/reference/ml/ml-shared.asciidoc @@ -1388,7 +1388,10 @@ tag::state-datafeed[] The status of the {dfeed}, which can be one of the following values: + -- +* `starting`: The {dfeed} has been requested to start but has not yet started. * `started`: The {dfeed} is actively receiving data. +* `stopping`: The {dfeed} has been requested to stop gracefully and is +completing its final action. * `stopped`: The {dfeed} is stopped and will not receive data until it is re-started. -- diff --git a/docs/reference/modules/gateway.asciidoc b/docs/reference/modules/gateway.asciidoc index 2b0783c9de0b0..c5bb5d1579c4f 100644 --- a/docs/reference/modules/gateway.asciidoc +++ b/docs/reference/modules/gateway.asciidoc @@ -10,12 +10,14 @@ recover the cluster state and the cluster's data: `gateway.expected_nodes`:: + deprecated:[7.7.0, This setting will be removed in 8.0. You should use `gateway.expected_data_nodes` instead.] The number of (data or master) nodes that are expected to be in the cluster. Recovery of local shards will start as soon as the expected number of nodes have joined the cluster. Defaults to `0` `gateway.expected_master_nodes`:: + deprecated:[7.7.0, This setting will be removed in 8.0. You should use `gateway.expected_data_nodes` instead.] The number of master nodes that are expected to be in the cluster. 
Recovery of local shards will start as soon as the expected number of master nodes have joined the cluster. Defaults to `0` @@ -37,10 +39,12 @@ as long as the following conditions are met: `gateway.recover_after_nodes`:: + deprecated:[7.7.0, This setting will be removed in 8.0. You should use `gateway.recover_after_data_nodes` instead.] Recover as long as this many data or master nodes have joined the cluster. `gateway.recover_after_master_nodes`:: + deprecated:[7.7.0, This setting will be removed in 8.0. You should use `gateway.recover_after_data_nodes` instead.] Recover as long as this many master nodes have joined the cluster. `gateway.recover_after_data_nodes`:: @@ -53,8 +57,8 @@ NOTE: These settings only take effect on a full cluster restart. === Dangling indices When a node joins the cluster, any shards stored in its local data -directory which do not already exist in the cluster will be imported into the -cluster. This functionality is intended as a best effort to help users who -lose all master nodes. If a new master node is started which is unaware of -the other indices in the cluster, adding the old nodes will cause the old +directory which do not already exist in the cluster will be imported into the +cluster. This functionality is intended as a best effort to help users who +lose all master nodes. If a new master node is started which is unaware of +the other indices in the cluster, adding the old nodes will cause the old indices to be imported, instead of being deleted. diff --git a/docs/reference/modules/node.asciidoc b/docs/reference/modules/node.asciidoc index 987ab35204e79..47b64ac4bd560 100644 --- a/docs/reference/modules/node.asciidoc +++ b/docs/reference/modules/node.asciidoc @@ -50,6 +50,13 @@ there must be at least one {ml} node in your cluster. For more information about IMPORTANT: If you use the {oss-dist}, do not set `node.ml`. Otherwise, the node fails to start. +<>:: + +A node that has `xpack.transform.enabled` and `node.transform` set to `true`. If +you want to use {transforms}, there must be at least one {transform} node in +your cluster. For more information, see <> and +<>. + [NOTE] [[coordinating-node]] .Coordinating node @@ -96,7 +103,7 @@ restarts. ==== Dedicated master-eligible node It is important for the health of the cluster that the elected master node has -the resources it needs to fulfil its responsibilities. If the elected master +the resources it needs to fulfill its responsibilities. If the elected master node is overloaded with other tasks then the cluster may not operate well. In particular, indexing and searching your data can be very resource-intensive, so in large or high-throughput clusters it is a good idea to avoid using the @@ -118,7 +125,9 @@ node.data: false <3> node.ingest: false <4> node.ml: false <5> xpack.ml.enabled: true <6> -cluster.remote.connect: false <7> +node.transform: false <7> +xpack.transform.enabled: true <8> +cluster.remote.connect: false <9> ------------------- <1> The `node.master` role is enabled by default. <2> The `node.voting_only` role is disabled by default. @@ -126,7 +135,9 @@ cluster.remote.connect: false <7> <4> Disable the `node.ingest` role (enabled by default). <5> Disable the `node.ml` role (enabled by default). <6> The `xpack.ml.enabled` setting is enabled by default. -<7> Disable remote cluster connections (enabled by default). +<7> Disable the `node.transform` role. +<8> The `xpack.transform.enabled` setting is enabled by default. +<9> Disable remote cluster connections (enabled by default). 
To create a dedicated master-eligible node in the {oss-dist}, set: @@ -197,7 +208,9 @@ node.data: false <3> node.ingest: false <4> node.ml: false <5> xpack.ml.enabled: true <6> -cluster.remote.connect: false <7> +node.transform: false <7> +xpack.transform.enabled: true <8> +cluster.remote.connect: false <9> ------------------- <1> The `node.master` role is enabled by default. <2> Enable the `node.voting_only` role (disabled by default). @@ -205,11 +218,13 @@ cluster.remote.connect: false <7> <4> Disable the `node.ingest` role (enabled by default). <5> Disable the `node.ml` role (enabled by default). <6> The `xpack.ml.enabled` setting is enabled by default. -<7> Disable remote cluster connections (enabled by default). +<7> Disable the `node.transform` role. +<8> The `xpack.transform.enabled` setting is enabled by default. +<9> Disable remote cluster connections (enabled by default). [float] [[data-node]] -=== Data Node +=== Data node Data nodes hold the shards that contain the documents you have indexed. Data nodes handle data related operations like CRUD, search, and aggregations. @@ -227,14 +242,16 @@ node.voting_only: false <2> node.data: true <3> node.ingest: false <4> node.ml: false <5> -cluster.remote.connect: false <6> +node.transform: false <6> +cluster.remote.connect: false <7> ------------------- <1> Disable the `node.master` role (enabled by default). <2> The `node.voting_only` role is disabled by default. <3> The `node.data` role is enabled by default. <4> Disable the `node.ingest` role (enabled by default). <5> Disable the `node.ml` role (enabled by default). -<6> Disable remote cluster connections (enabled by default). +<6> Disable the `node.transform` role. +<7> Disable remote cluster connections (enabled by default). To create a dedicated data node in the {oss-dist}, set: [source,yaml] @@ -251,7 +268,7 @@ cluster.remote.connect: false <4> [float] [[node-ingest-node]] -=== Ingest Node +=== Ingest node Ingest nodes can execute pre-processing pipelines, composed of one or more ingest processors. Depending on the type of operations performed by the ingest @@ -267,14 +284,16 @@ node.voting_only: false <2> node.data: false <3> node.ingest: true <4> node.ml: false <5> -cluster.remote.connect: false <6> +node.transform: false <6> +cluster.remote.connect: false <7> ------------------- <1> Disable the `node.master` role (enabled by default). <2> The `node.voting_only` role is disabled by default. <3> Disable the `node.data` role (enabled by default). <4> The `node.ingest` role is enabled by default. <5> Disable the `node.ml` role (enabled by default). -<6> Disable remote cluster connections (enabled by default). +<6> Disable the `node.transform` role. +<7> Disable remote cluster connections (enabled by default). To create a dedicated ingest node in the {oss-dist}, set: @@ -320,14 +339,16 @@ node.voting_only: false <2> node.data: false <3> node.ingest: false <4> node.ml: false <5> -cluster.remote.connect: false <6> +node.transform: false <6> +cluster.remote.connect: false <7> ------------------- <1> Disable the `node.master` role (enabled by default). <2> The `node.voting_only` role is disabled by default. <3> Disable the `node.data` role (enabled by default). <4> Disable the `node.ingest` role (enabled by default). <5> Disable the `node.ml` role (enabled by default). -<6> Disable remote cluster connections (enabled by default). +<6> Disable the `node.transform` role. +<7> Disable remote cluster connections (enabled by default). 
To create a dedicated coordinating node in the {oss-dist}, set:

@@ -348,7 +369,7 @@ cluster.remote.connect: false <4>
 === [xpack]#Machine learning node#
 
 The {ml-features} provide {ml} nodes, which run jobs and handle {ml} API
-requests. If `xpack.ml.enabled` is set to true and `node.ml` is set to `false`,
+requests. If `xpack.ml.enabled` is set to `true` and `node.ml` is set to `false`,
 the node can service API requests but it cannot run jobs.
 
 If you want to use {ml-features} in your cluster, you must enable {ml}
@@ -367,7 +388,9 @@ node.data: false <3>
 node.ingest: false <4>
 node.ml: true <5>
 xpack.ml.enabled: true <6>
-cluster.remote.connect: false <7>
+node.transform: false <7>
+xpack.transform.enabled: true <8>
+cluster.remote.connect: false <9>
 -------------------
 <1> Disable the `node.master` role (enabled by default).
 <2> The `node.voting_only` role is disabled by default.
@@ -375,7 +398,42 @@ cluster.remote.connect: false <7>
 <4> Disable the `node.ingest` role (enabled by default).
 <5> The `node.ml` role is enabled by default.
 <6> The `xpack.ml.enabled` setting is enabled by default.
-<7> Disable remote cluster connections (enabled by default).
+<7> Disable the `node.transform` role.
+<8> The `xpack.transform.enabled` setting is enabled by default.
+<9> Disable remote cluster connections (enabled by default).
+
+[discrete]
+[[transform-node]]
+=== [xpack]#{transform-cap} node#
+
+{transform-cap} nodes run {transforms} and handle {transform} API requests.
+
+If you want to use {transforms} in your cluster, you must have
+`xpack.transform.enabled` set to `true` on all master-eligible nodes and all
+data nodes. You must also have `node.transform` set to `true` on at least one
+node. This is the default behavior. If you have the {oss-dist}, do not use these
+settings. For more information, see <>.
+
+To create a dedicated {transform} node in the {default-dist}, set:
+
+[source,yaml]
+-------------------
+node.master: false <1>
+node.voting_only: false <2>
+node.data: false <3>
+node.ingest: false <4>
+node.ml: false <5>
+node.transform: true <6>
+xpack.transform.enabled: true <7>
+cluster.remote.connect: false <8>
+-------------------
+<1> Disable the `node.master` role.
+<2> Disable the `node.voting_only` role.
+<3> Disable the `node.data` role.
+<4> Disable the `node.ingest` role.
+<5> Disable the `node.ml` role.
+<6> Enable the `node.transform` role.
+<7> The `xpack.transform.enabled` setting is enabled by default.
+<8> Disable remote cluster connections.

[float]
[[change-node-role]]
diff --git a/docs/reference/monitoring/how-monitoring-works.asciidoc b/docs/reference/monitoring/how-monitoring-works.asciidoc
index dc1c54ce9b865..2a89f0aec6524 100644
--- a/docs/reference/monitoring/how-monitoring-works.asciidoc
+++ b/docs/reference/monitoring/how-monitoring-works.asciidoc
@@ -22,7 +22,7 @@ To learn how to collect monitoring data, see:
 * <>
 * <>
 * {kibana-ref}/xpack-monitoring.html[Monitoring {kib}]
-* {logstash-ref}/monitoring-logstash.html[Monitoring {ls}]
+* {logstash-ref}/configuring-logstash.html[Monitoring {ls}]
 * Monitoring Beats:
 ** {auditbeat-ref}/monitoring.html[{auditbeat}]
 ** {filebeat-ref}/monitoring.html[{filebeat}]
diff --git a/docs/reference/query-dsl/geo-queries.asciidoc b/docs/reference/query-dsl/geo-queries.asciidoc
index b3cc9112576cc..b4eb86763e702 100644
--- a/docs/reference/query-dsl/geo-queries.asciidoc
+++ b/docs/reference/query-dsl/geo-queries.asciidoc
@@ -18,9 +18,11 @@ Finds documents with geo-points within the specified distance of a central point
 
 Find documents with geo-points within the specified polygon.
<> query::
-Finds documents with geo-shapes which either intersect, are contained by, or do not intersect with the specified
-geo-shape.
-
+Finds documents with:
+
+* `geo-shapes` which either intersect, are contained by, or do not intersect
+with the specified geo-shape.
+* `geo-points` which intersect the specified geo-shape.

include::geo-bounding-box-query.asciidoc[]
diff --git a/docs/reference/query-dsl/geo-shape-query.asciidoc b/docs/reference/query-dsl/geo-shape-query.asciidoc
index 9706b90d82845..966267d4e3537 100644
--- a/docs/reference/query-dsl/geo-shape-query.asciidoc
+++ b/docs/reference/query-dsl/geo-shape-query.asciidoc
@@ -4,9 +4,9 @@
 Geo-shape
 ++++
 
-Filter documents indexed using the `geo_shape` type.
+Filter documents indexed using the `geo_shape` or `geo_point` type.
 
-Requires the <>.
+Requires the <> or the <>.
 
 The `geo_shape` query uses the same grid square representation as the
 `geo_shape` mapping to find documents that have a shape that intersects
@@ -142,7 +142,7 @@ GET /example/_search
 
 The <> mapping parameter determines which spatial relation operators may be used at search time.
 
-The following is a complete list of spatial relation operators available:
+The following is a complete list of spatial relation operators available when searching a field of type `geo_shape`:
 
 * `INTERSECTS` - (default) Return all documents whose `geo_shape` field
 intersects the query geometry.
@@ -153,6 +153,11 @@ is within the query geometry.
 * `CONTAINS` - Return all documents whose `geo_shape` field
 contains the query geometry.
 
+When searching a field of type `geo_point`, there is a single supported spatial relation operator:
+
+* `INTERSECTS` - (default) Return all documents whose `geo_point` field
+intersects the query geometry.
+
 [float]
 ==== Ignore Unmapped
 
@@ -162,6 +167,15 @@ querying multiple indexes which might have different mappings. When set to
 `false` (the default value) the query will throw an exception if the field
 is not mapped.
 
+==== Shape types not supported for geo-point
+
+When searching a field of type `geo_point`, the following query shape types are not supported:
+
+* `POINT`
+* `LINE`
+* `MULTIPOINT`
+* `MULTILINE`
+
 ==== Notes
 Geo-shape queries on geo-shapes implemented with <> will not be executed if <> is set to false.
diff --git a/docs/reference/query-dsl/match-query.asciidoc b/docs/reference/query-dsl/match-query.asciidoc
index a2c912bf8aa48..ecd8d3dbca0c1 100644
--- a/docs/reference/query-dsl/match-query.asciidoc
+++ b/docs/reference/query-dsl/match-query.asciidoc
@@ -258,7 +258,7 @@ GET /_search
 The `match` query supports multi-terms synonym expansion with the
 <> token filter. When this filter is used, the parser creates a
 phrase query for each multi-terms synonyms.
-For example, the following synonym: `"ny, new york" would produce:`
+For example, the following synonym: `"ny, new york"` would produce:
 
 `(ny OR ("new york"))`
diff --git a/docs/reference/query-dsl/range-query.asciidoc b/docs/reference/query-dsl/range-query.asciidoc
index ea7e7e9529435..f98eef9ee0534 100644
--- a/docs/reference/query-dsl/range-query.asciidoc
+++ b/docs/reference/query-dsl/range-query.asciidoc
@@ -203,7 +203,7 @@ the entire month.
 `lte`::
 +
 --
-Rounds up to the lastest millisecond.
+Rounds up to the latest millisecond.
 
 For example, `2014-11-18||/M` rounds up to `2014-11-30T23:59:59.999`, including
 the entire month.
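To make the rounding behaviour concrete, here is a sketch of a range query using the date math described above (the `timestamp` field name is illustrative):

[source,console]
--------------------------------------------------
GET /_search
{
  "query": {
    "range": {
      "timestamp": {
        "gte": "2014-11-18||/M",  <1>
        "lte": "2014-11-18||/M"   <2>
      }
    }
  }
}
--------------------------------------------------
<1> `gte` rounds down to `2014-11-01T00:00:00.000`, including the entire month.
<2> `lte` rounds up to `2014-11-30T23:59:59.999`, including the entire month.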
diff --git a/docs/reference/query-dsl/terms-query.asciidoc b/docs/reference/query-dsl/terms-query.asciidoc index 85605457ea315..05d6a85566678 100644 --- a/docs/reference/query-dsl/terms-query.asciidoc +++ b/docs/reference/query-dsl/terms-query.asciidoc @@ -22,7 +22,7 @@ GET /_search "query" : { "terms" : { "user" : ["kimchy", "elasticsearch"], - "boost" : 1.0 + "boost" : 1.0 } } } @@ -93,16 +93,16 @@ To perform a terms lookup, use the following parameters. [[query-dsl-terms-lookup-params]] ====== Terms lookup parameters `index`:: -(Optional, string) Name of the index from which to fetch field values. +(Required, string) Name of the index from which to fetch field values. `id`:: -(Optional, string) <> of the document from which to fetch +(Required, string) <> of the document from which to fetch field values. `path`:: + -- -(Optional, string) Name of the field from which to fetch field values. {es} uses +(Required, string) Name of the field from which to fetch field values. {es} uses these values as search terms for the query. If the field values include an array of nested inner objects, you can access @@ -117,7 +117,7 @@ when the document was indexed, this parameter is required. [[query-dsl-terms-lookup-example]] ====== Terms lookup example -To see how terms lookup works, try the following example. +To see how terms lookup works, try the following example. . Create an index with a `keyword` field named `color`. + @@ -244,4 +244,4 @@ field, {es} returns both documents. } ---- // TESTRESPONSE[s/"took" : 17/"took" : $body.took/] --- \ No newline at end of file +-- diff --git a/docs/reference/redirects.asciidoc b/docs/reference/redirects.asciidoc index 9ab175ee4d8e4..03d54def6a054 100644 --- a/docs/reference/redirects.asciidoc +++ b/docs/reference/redirects.asciidoc @@ -120,7 +120,7 @@ See <>. [role="exclude",id="native-settings"] ==== Native realm settings -See <>. +See <>. [role="exclude",id="configuring-saml-realm"] === Configuring a SAML realm @@ -130,27 +130,27 @@ See <>. [role="exclude",id="saml-settings"] ==== SAML realm settings -See <>. +See <>. [role="exclude",id="_saml_realm_signing_settings"] ==== SAML realm signing settings -See <>. +See <>. [role="exclude",id="_saml_realm_encryption_settings"] ==== SAML realm encryption settings -See <>. +See <>. [role="exclude",id="_saml_realm_ssl_settings"] ==== SAML realm SSL settings -See <>. +See <>. [role="exclude",id="configuring-file-realm"] === Configuring a file realm -See <>. +See <>. [role="exclude",id="ldap-user-search"] === User search mode and user DN templates mode @@ -170,7 +170,7 @@ See <>. [role="exclude",id="ldap-ssl"] === Setting up SSL between Elasticsearch and LDAP -See <>. +See <>. [role="exclude",id="configuring-kerberos-realm"] === Configuring a Kerberos realm @@ -211,7 +211,7 @@ See <>. [role="exclude",id="mapping-roles-ad"] === Mapping Active Directory users and groups to roles -See <>. +See <>. [role="exclude",id="how-security-works"] === How security works @@ -237,9 +237,9 @@ See the details in This page was deleted. [[ml-datafeed-chunking-config]] -See the details in <>, <>, +See the details in <>, <>, [[ml-datafeed-delayed-data-check-config]] -<>, +<>, [[ml-datafeed-counts]] <>. @@ -323,7 +323,7 @@ See <>. [role="exclude",id="ml-dfa-analysis-objects"] === Analysis configuration objects -This page was deleted. +This page was deleted. See <>. [role="exclude",id="slm-api-delete"] @@ -376,7 +376,17 @@ See <>. See <> and <>. 
-[role="exclude",id="async-search"] -=== Asynchronous search +[role="exclude",id="indices-component-templates"] +=== Component template APIs coming::[7.x] + +[role="exclude",id="data-streams"] +=== Data stream APIs + +coming::[7.x] + +[role="exclude",id="cat-transform"] +=== cat transform API + +See <>. \ No newline at end of file diff --git a/docs/reference/rest-api/common-parms.asciidoc b/docs/reference/rest-api/common-parms.asciidoc index baaea29f7e7e6..d297b70fa5553 100644 --- a/docs/reference/rest-api/common-parms.asciidoc +++ b/docs/reference/rest-api/common-parms.asciidoc @@ -593,10 +593,12 @@ are supported: * <> * <> * <> +* <> * <> * <> * <> * <> +* <> * <> * <> * <> diff --git a/docs/reference/rollup/apis/get-job.asciidoc b/docs/reference/rollup/apis/get-job.asciidoc index de45f96549f7f..a13ac77bbb884 100644 --- a/docs/reference/rollup/apis/get-job.asciidoc +++ b/docs/reference/rollup/apis/get-job.asciidoc @@ -142,14 +142,16 @@ The API yields the following response: "index_total": 0, "search_failures": 0, "search_time_in_ms": 0, - "search_total": 0 + "search_total": 0, + "processing_time_in_ms": 0, + "processing_total": 0 } } ] } ---- -The `jobs` array contains a single job (`id: sensor`) since we requested a single job in the endpoint's URL. +The `jobs` array contains a single job (`id: sensor`) since we requested a single job in the endpoint's URL. If we add another job, we can see how multi-job responses are handled: [source,console] @@ -245,7 +247,9 @@ Which will yield the following response: "index_total": 0, "search_failures": 0, "search_time_in_ms": 0, - "search_total": 0 + "search_total": 0, + "processing_time_in_ms": 0, + "processing_total": 0 } }, { @@ -299,7 +303,9 @@ Which will yield the following response: "index_total": 0, "search_failures": 0, "search_time_in_ms": 0, - "search_total": 0 + "search_total": 0, + "processing_time_in_ms": 0, + "processing_total": 0 } } ] diff --git a/docs/reference/search.asciidoc b/docs/reference/search.asciidoc index e8a8a42da6a00..a33807cf698ac 100644 --- a/docs/reference/search.asciidoc +++ b/docs/reference/search.asciidoc @@ -152,6 +152,8 @@ high). This default value is `5`. include::search/search.asciidoc[] +include::search/async-search.asciidoc[] + include::search/uri-request.asciidoc[] include::search/request-body.asciidoc[] diff --git a/docs/reference/search/async-search.asciidoc b/docs/reference/search/async-search.asciidoc new file mode 100644 index 0000000000000..7f5957a94e41c --- /dev/null +++ b/docs/reference/search/async-search.asciidoc @@ -0,0 +1,219 @@ +[role="xpack"] +[testenv="basic"] +[[async-search]] +=== Async search + +The async search API let you asynchronously execute a +search request, monitor its progress, and retrieve partial results +as they become available. + +[[submit-async-search]] +==== Submit async search API + +Executes a search request asynchronously. It accepts the same +parameters and request body as the <>. + +[source,console,id=submit-async-search-date-histogram-example] +-------------------------------------------------- +POST /sales*/_async_search?size=0 +{ + "sort" : [ + { "date" : {"order" : "asc"} } + ], + "aggs" : { + "sale_date" : { + "date_histogram" : { + "field" : "date", + "calendar_interval": "1d" + } + } + } +} +-------------------------------------------------- +// TEST[setup:sales] +// TEST[s/size=0/size=0&wait_for_completion=10s&clean_on_completion=false/] + +The response contains an identifier of the search being executed. 
+You can use this ID to later retrieve the search's final results. +The currently available search +results are returned as part of the <> object. + +[source,console-result] +-------------------------------------------------- +{ + "id" : "FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=", <1> + "is_partial" : true, <2> + "is_running" : true, <3> + "start_time_in_millis" : 1583945890986, + "expiration_time_in_millis" : 1584377890986, + "response" : { + "took" : 1122, + "timed_out" : false, + "num_reduce_phases" : 0, + "_shards" : { + "total" : 562, <4> + "successful" : 3, <5> + "skipped" : 0, + "failed" : 0 + }, + "hits" : { + "total" : { + "value" : 157483, <6> + "relation" : "gte" + }, + "max_score" : null, + "hits" : [ ] + } + } +} +-------------------------------------------------- +// TESTRESPONSE[s/FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=/$body.id/] +// TESTRESPONSE[s/"is_partial" : true/"is_partial": $body.is_partial/] +// TESTRESPONSE[s/"is_running" : true/"is_running": $body.is_running/] +// TESTRESPONSE[s/1583945890986/$body.start_time_in_millis/] +// TESTRESPONSE[s/1584377890986/$body.expiration_time_in_millis/] +// TESTRESPONSE[s/"took" : 1122/"took": $body.response.took/] +// TESTRESPONSE[s/"num_reduce_phases" : 0,//] +// TESTRESPONSE[s/"total" : 562/"total": $body.response._shards.total/] +// TESTRESPONSE[s/"successful" : 3/"successful": $body.response._shards.successful/] +// TESTRESPONSE[s/"value" : 157483/"value": $body.response.hits.total.value/] +// TESTRESPONSE[s/"relation" : "gte"/"relation": $body.response.hits.total.relation/] +// TESTRESPONSE[s/"hits" : \[ \]\n\s\s\s\s\}/"hits" : \[\]},"aggregations": $body.response.aggregations/] + +<1> Identifier of the async search that can be used to monitor its progress, retrieve its results, and/or delete it. +<2> Whether the returned search results are partial or final +<3> Whether the search is still being executed or it has completed +<4> How many shards the search will be executed on, overall +<5> How many shards have successfully completed the search +<6> How many documents are currently matching the query, which belong to the shards that have already completed the search + +It is possible to block and wait until the search is completed up to a certain +timeout by providing the `wait_for_completion` parameter, which defaults to +`1` second. + +You can also specify how long the async search needs to be +available through the `keep_alive` parameter, which defaults to `5d` (five days). +Ongoing async searches and any saved search results are deleted after this +period. + +NOTE: When the primary sort of the results is an indexed field, shards get +sorted based on minimum and maximum value that they hold for that field, +hence partial results become available following the sort criteria that +was requested. + +The submit async search API supports the same <> +as the search API, though some have different default values: + +* `batched_reduce_size` defaults to `5`: this affects how often partial results +become available, which happens whenever shard results are reduced. A partial +reduction is performed every time the coordinating node has received a certain +number of new shard responses (`5` by default). +* `request_cache` defaults to `true` +* `pre_filter_shard_size` defaults to `1`: this is to enforce the execution of +a pre-filter roundtrip to retrieve statistics from each shard so that the ones +that surely don't hold any document matching the query get skipped. 
+* `ccs_minimize_roundtrips` defaults to `false`, which is also the only
+supported value.
+
+WARNING: Async search does not support <>
+nor search requests that only include the <>.
+{ccs} is supported only with <>
+set to `false`.
+
+[[get-async-search]]
+==== Get async search
+
+The get async search API retrieves the results of a previously submitted
+async search request given its ID. If the {es} {security-features} are enabled,
+access to the results of a specific async search is restricted to the user
+that submitted it in the first place.
+
+[source,console,id=get-async-search-date-histogram-example]
+--------------------------------------------------
+GET /_async_search/FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=
+--------------------------------------------------
+// TEST[continued s/FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=/\${body.id}/]
+
+[source,console-result]
+--------------------------------------------------
+{
+  "id" : "FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=",
+  "is_partial" : true, <1>
+  "is_running" : true, <2>
+  "start_time_in_millis" : 1583945890986,
+  "expiration_time_in_millis" : 1584377890986, <3>
+  "response" : {
+    "took" : 12144,
+    "timed_out" : false,
+    "num_reduce_phases" : 46, <4>
+    "_shards" : {
+      "total" : 562, <5>
+      "successful" : 188,
+      "skipped" : 0,
+      "failed" : 0
+    },
+    "hits" : {
+      "total" : {
+        "value" : 456433,
+        "relation" : "eq"
+      },
+      "max_score" : null,
+      "hits" : [ ]
+    },
+    "aggregations" : { <6>
+      "sale_date" : {
+        "buckets" : []
+      }
+    }
+  }
+}
+--------------------------------------------------
+// TESTRESPONSE[s/FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=/$body.id/]
+// TESTRESPONSE[s/"is_partial" : true/"is_partial" : false/]
+// TESTRESPONSE[s/"is_running" : true/"is_running" : false/]
+// TESTRESPONSE[s/1583945890986/$body.start_time_in_millis/]
+// TESTRESPONSE[s/1584377890986/$body.expiration_time_in_millis/]
+// TESTRESPONSE[s/"took" : 12144/"took": $body.response.took/]
+// TESTRESPONSE[s/"total" : 562/"total": $body.response._shards.total/]
+// TESTRESPONSE[s/"successful" : 188/"successful": $body.response._shards.successful/]
+// TESTRESPONSE[s/"value" : 456433/"value": $body.response.hits.total.value/]
+// TESTRESPONSE[s/"buckets" : \[\]/"buckets": $body.response.aggregations.sale_date.buckets/]
+// TESTRESPONSE[s/"num_reduce_phases" : 46,//]
+
+<1> Whether the returned search results are partial or final.
+<2> Whether the search is still being executed or it has completed.
+<3> When the async search will expire.
+<4> Indicates how many reductions of the results have been performed. If this
+number increases compared to the last retrieved results, you can expect
+additional results included in the search response.
+<5> Indicates how many shards have executed the query. Note that in order for
+shard results to be included in the search response, they need to be reduced
+first.
+<6> Partial aggregation results, coming from the shards that have already
+completed the execution of the query.
+
+The `wait_for_completion` parameter, which defaults to `1` second, can also be
+provided when calling the get async search API, in order to wait for the search
+to be completed up until the provided timeout. Final results will be returned
+if available before the timeout expires, otherwise the currently available
+results will be returned once the timeout expires.
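For example, the following sketch (reusing the illustrative search ID from the responses above) blocks for up to thirty seconds before returning whatever results are available at that point:

[source,console]
--------------------------------------------------
GET /_async_search/FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=?wait_for_completion=30s
--------------------------------------------------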
+ +The `keep_alive` parameter specifies how long the async search should be +available in the cluster. When not specified, the `keep_alive` set with the +corresponding submit async request will be used. Otherwise, it is possible to +override such value and extend the validity of the request. When this period +expires, the search, if still running, is cancelled. If the search is +completed, its saved results are deleted. + +[[delete-async-search]] +==== Delete async search + +You can use the delete async search API to manually delete an async search +by ID. If the search is still running, the search request will be cancelled. +Otherwise, the saved search results are deleted. + +[source,console,id=delete-async-search-date-histogram-example] +-------------------------------------------------- +DELETE /_async_search/FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc= +-------------------------------------------------- +// TEST[continued s/FmRldE8zREVEUzA2ZVpUeGs2ejJFUFEaMkZ5QTVrSTZSaVN3WlNFVmtlWHJsdzoxMDc=/\${body.id}/] diff --git a/docs/reference/search/multi-search.asciidoc b/docs/reference/search/multi-search.asciidoc index f24d963f41ac3..d3751af655df4 100644 --- a/docs/reference/search/multi-search.asciidoc +++ b/docs/reference/search/multi-search.asciidoc @@ -23,7 +23,7 @@ GET twitter/_msearch ==== {api-description-title} The multi search API executes several searches from a single API request. -The format of the request is similar to the bulk API format and makes use +The format of the request is similar to the bulk API format and makes use of the newline delimited JSON (NDJSON) format. The structure is as follows: @@ -85,7 +85,7 @@ Maximum number of concurrent searches the multi search API can execute. -- (Optional, integer) Maximum number of concurrent shard requests that each sub-search request -executes per node. Defaults to `5`. +executes per node. Defaults to `5`. You can use this parameter to prevent a request from overloading a cluster. For example, a default request hits all indices in a cluster. This could cause shard @@ -103,8 +103,13 @@ Defines a threshold that enforces a pre-filter roundtrip to prefilter search shards based on query rewriting if the number of shards the search request expands to exceeds the threshold. This filter roundtrip can limit the number of shards significantly if for instance a shard can not match any documents based -on it's rewrite method i.e., if date filters are mandatory to match but the -shard bounds and the query are disjoint. Defaults to `128`. +on its rewrite method i.e., if date filters are mandatory to match but the +shard bounds and the query are disjoint. +When unspecified, the pre-filter phase is executed if any of these +conditions is met: + - The request targets more than `128` shards. + - The request targets one or more read-only index. + - The primary sort of the query targets an indexed field. `rest_total_hits_as_int`:: (Optional, boolean) @@ -121,7 +126,7 @@ to a specific shard. -- (Optional, string) Indicates whether global term and document frequencies should be used when -scoring returned documents. +scoring returned documents. Options are: @@ -134,7 +139,7 @@ This is usually faster but less accurate. Documents are scored using global term and document frequencies across all shards. This is usually slower but more accurate. -- - + `typed_keys`:: (Optional, boolean) Specifies whether aggregation and suggester names should be prefixed by their @@ -196,7 +201,7 @@ to a specific shard. 
-- (Optional, string) Indicates whether global term and document frequencies should be used when -scoring returned documents. +scoring returned documents. Options are: @@ -234,18 +239,18 @@ Number of hits to return. Defaults to `10`. ==== {api-response-body-title} `responses`:: - (array) Includes the search response and status code for each search request - matching its order in the original multi search request. If there was a - complete failure for a specific search request, an object with `error` message - and corresponding status code will be returned in place of the actual search + (array) Includes the search response and status code for each search request + matching its order in the original multi search request. If there was a + complete failure for a specific search request, an object with `error` message + and corresponding status code will be returned in place of the actual search response. [[search-multi-search-api-example]] ==== {api-examples-title} -The header part includes which index / indices to search on, the `search_type`, -`preference`, and `routing`. The body includes the typical search body request +The header part includes which index / indices to search on, the `search_type`, +`preference`, and `routing`. The body includes the typical search body request (including the `query`, `aggregations`, `from`, `size`, and so on). [source,js] @@ -308,7 +313,7 @@ See <> ==== Template support Much like described in <> for the _search resource, _msearch -also provides support for templates. Submit them like follows for inline +also provides support for templates. Submit them like follows for inline templates: [source,console] @@ -377,6 +382,6 @@ GET _msearch/template [[multi-search-partial-responses]] ==== Partial responses -To ensure fast responses, the multi search API will respond with partial results -if one or more shards fail. See <> for more +To ensure fast responses, the multi search API will respond with partial results +if one or more shards fail. See <> for more information. diff --git a/docs/reference/search/search.asciidoc b/docs/reference/search/search.asciidoc index 1e8e874df2a9b..255cb0048be3c 100644 --- a/docs/reference/search/search.asciidoc +++ b/docs/reference/search/search.asciidoc @@ -25,7 +25,7 @@ GET /twitter/_search?q=tag:wow [[search-search-api-desc]] ==== {api-description-title} -Allows you to execute a search query and get back search hits that match the +Allows you to execute a search query and get back search hits that match the query. The query can either be provided using a simple <>, or using a <>. @@ -33,8 +33,8 @@ query. The query can either be provided using a simple [[search-partial-responses]] ===== Partial responses -To ensure fast responses, the search API will respond with partial results if -one or more shards fail. See <> for more +To ensure fast responses, the search API will respond with partial results if +one or more shards fail. See <> for more information. [[search-search-api-path-params]] @@ -51,163 +51,167 @@ include::{docdir}/rest-api/common-parms.asciidoc[tag=allow-no-indices] Defaults to `true`. `allow_partial_search_results`:: - (Optional, boolean) Indicates if an error should be returned if there is a + (Optional, boolean) Indicates if an error should be returned if there is a partial search failure or timeout. Defaults to `true`. `analyzer`:: (Optional, string) Defines the analyzer to use for the query string. 
- + `analyze_wildcard`:: - (Optional, boolean) If `true`, wildcard and prefix queries will also be + (Optional, boolean) If `true`, wildcard and prefix queries will also be analyzed. Defaults to `false`. - + `batched_reduce_size`:: - (Optional, integer) The number of shard results that should be reduced at once - on the coordinating node. This value should be used as a protection mechanism - to reduce the memory overhead per search request if the potential number of + (Optional, integer) The number of shard results that should be reduced at once + on the coordinating node. This value should be used as a protection mechanism + to reduce the memory overhead per search request if the potential number of shards in the request can be large. Defaults to `512`. - + `ccs_minimize_roundtrips`:: - (Optional, boolean) Indicates whether network round-trips should be minimized + (Optional, boolean) Indicates whether network round-trips should be minimized as part of cross-cluster search requests execution. Defaults to `true`. - + `default_operator`:: - (Optional, string) The default operator for query string query (AND or OR). + (Optional, string) The default operator for query string query (AND or OR). Defaults to `OR`. - + `df`:: - (Optional, string) Defines the field to use as default where no field prefix + (Optional, string) Defines the field to use as default where no field prefix is given in the query string. - + `docvalue_fields`:: - (Optional, string) A comma-separated list of fields to return as the docvalue + (Optional, string) A comma-separated list of fields to return as the docvalue representation of a field for each hit. - + include::{docdir}/rest-api/common-parms.asciidoc[tag=expand-wildcards] + Defaults to `open`. - + `explain`:: - (Optional, boolean) If `true`, returns detailed information about score + (Optional, boolean) If `true`, returns detailed information about score computation as part of a hit. Defaults to `false`. - + `from`:: (Optional, integer) Defines the starting offset. Defaults to `0`. `ignore_throttled`:: - (Optional, boolean) If `true`, concrete, expanded or aliased indices will be + (Optional, boolean) If `true`, concrete, expanded or aliased indices will be ignored when throttled. Defaults to `false`. include::{docdir}/rest-api/common-parms.asciidoc[tag=index-ignore-unavailable] - + `lenient`:: - (Optional, boolean) If `true`, format-based query failures (such as + (Optional, boolean) If `true`, format-based query failures (such as providing text to a numeric field) will be ignored. Defaults to `false`. - + `max_concurrent_shard_requests`:: - (Optional, integer) Defines the number of concurrent shard requests per node - this search executes concurrently. This value should be used to limit the - impact of the search on the cluster in order to limit the number of concurrent + (Optional, integer) Defines the number of concurrent shard requests per node + this search executes concurrently. This value should be used to limit the + impact of the search on the cluster in order to limit the number of concurrent shard requests. Defaults to `5`. - + `pre_filter_shard_size`:: - (Optional, integer) Defines a threshold that enforces a pre-filter roundtrip - to prefilter search shards based on query rewriting if the number of shards - the search request expands to exceeds the threshold. This filter roundtrip can - limit the number of shards significantly if for instance a shard can not match - any documents based on it's rewrite method ie. 
if date filters are mandatory - to match but the shard bounds and the query are disjoint. Defaults to `128`. + (Optional, integer) Defines a threshold that enforces a pre-filter roundtrip + to prefilter search shards based on query rewriting if the number of shards + the search request expands to exceeds the threshold. This filter roundtrip can + limit the number of shards significantly if for instance a shard can not match + any documents based on its rewrite method ie. if date filters are mandatory + to match but the shard bounds and the query are disjoint. + When unspecified, the pre-filter phase is executed if any of these conditions is met: + - The request targets more than `128` shards. + - The request targets one or more read-only index. + - The primary sort of the query targets an indexed field. `preference`:: - (Optional, string) Specifies the node or shard the operation should be + (Optional, string) Specifies the node or shard the operation should be performed on. Random by default. - + `q`:: (Optional, string) Query in the Lucene query string syntax. `request_cache`:: - (Optional, boolean) If `true`, request cache will be used for this request. + (Optional, boolean) If `true`, request cache will be used for this request. Defaults to index level settings. - + `rest_total_hits_as_int`:: - (Optional, boolean) Indicates whether hits.total should be rendered as an + (Optional, boolean) Indicates whether hits.total should be rendered as an integer or an object in the rest search response. Defaults to `false`. `routing`:: - (Optional, <>) Specifies how long a consistent view of + (Optional, <>) Specifies how long a consistent view of the index should be maintained for scrolled search. - + `search_type`:: - (Optional, string) Defines the type of the search operation. Available + (Optional, string) Defines the type of the search operation. Available options: * `query_then_fetch` * `dfs_query_then_fetch` `seq_no_primary_term`:: - (Optional, boolean) If `true`, returns sequence number and primary term of the + (Optional, boolean) If `true`, returns sequence number and primary term of the last modification of each hit. `size`:: (Optional, integer) Defines the number of hits to return. Defaults to `10`. - + `sort`:: (Optional, string) A comma-separated list of : pairs. - + `_source`:: - (Optional, string) True or false to return the `_source` field or not, or a + (Optional, string) True or false to return the `_source` field or not, or a list of fields to return. - + `_source_excludes`:: - (Optional, string) A list of fields to exclude from the returned `_source` + (Optional, string) A list of fields to exclude from the returned `_source` field. - + `_source_includes`:: - (Optional, string) A list of fields to extract and return from the `_source` + (Optional, string) A list of fields to extract and return from the `_source` field. - + `stats`:: - (Optional, string) Specific `tag` of the request for logging and statistical + (Optional, string) Specific `tag` of the request for logging and statistical purposes. `stored_fields`:: - (Optional, string) A comma-separated list of stored fields to return as part + (Optional, string) A comma-separated list of stored fields to return as part of a hit. - + `suggest_field`:: (Optional, string) Specifies which field to use for suggestions. - + `suggest_mode`:: - (Optional, string) Specifies suggest mode. Defaults to `missing`. Available + (Optional, string) Specifies suggest mode. Defaults to `missing`. 
Available options: * `always` * `missing` * `popular` - + `suggest_size`:: (Optional, integer) Defines how many suggestions to return in response. - + `suggest_text`:: - (Optional, string) The source text for which the suggestions should be + (Optional, string) The source text for which the suggestions should be returned. - + `terminate_after`:: - (Optional, integer) The maximum number of documents to collect for each shard, + (Optional, integer) The maximum number of documents to collect for each shard, upon reaching which the query execution will terminate early. - + include::{docdir}/rest-api/common-parms.asciidoc[tag=timeout] `track_scores`:: - (Optional, boolean) If `true`, then calculates and returns scores even if they + (Optional, boolean) If `true`, then calculates and returns scores even if they are not used for sorting. - + `track_total_hits`:: - (Optional, boolean) Indicates if the number of documents that match the query + (Optional, boolean) Indicates if the number of documents that match the query should be tracked. - + `typed_keys`:: - (Optional, boolean) Specifies whether aggregation and suggester names should + (Optional, boolean) Specifies whether aggregation and suggester names should be prefixed by their respective types in the response. - + `version`:: (Optional, boolean) If `true`, returns document version as part of a hit. @@ -216,7 +220,7 @@ include::{docdir}/rest-api/common-parms.asciidoc[tag=timeout] ==== {api-request-body-title} `query`:: - (Optional, <>) Defines the search definition using the + (Optional, <>) Defines the search definition using the <>. diff --git a/docs/reference/settings/monitoring-settings.asciidoc b/docs/reference/settings/monitoring-settings.asciidoc index 565780cc5b7a7..7ccf510b253de 100644 --- a/docs/reference/settings/monitoring-settings.asciidoc +++ b/docs/reference/settings/monitoring-settings.asciidoc @@ -17,10 +17,8 @@ file. To adjust how monitoring data is displayed in the monitoring UI, configure {kibana-ref}/monitoring-settings-kb.html[`xpack.monitoring` settings] in -`kibana.yml`. To control how monitoring data is collected from -Logstash, configure -{logstash-ref}/monitoring-internal-collection.html#monitoring-settings[`xpack.monitoring` settings] -in `logstash.yml`. +`kibana.yml`. To control how monitoring data is collected from Logstash, +configure monitoring settings in `logstash.yml`. For more information, see <>. 
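Putting a few of the search request parameters documented above together, here is a minimal sketch that issues such a request through the Elasticsearch low-level Java REST client; the index name, query, and parameter values are illustrative placeholders, not taken from this changeset:

[source,java]
--------------------------------------------------
import org.apache.http.HttpHost;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class SearchParamsExample {
    public static void main(String[] args) throws Exception {
        try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200, "http")).build()) {
            Request request = new Request("GET", "/my-index/_search");
            // Each URL parameter below corresponds to an entry in the list above.
            request.addParameter("q", "user:kimchy");         // Lucene query string syntax
            request.addParameter("default_operator", "AND");  // overrides the default OR
            request.addParameter("size", "20");               // more than the default 10 hits
            request.addParameter("track_total_hits", "true"); // track the full hit count
            Response response = client.performRequest(request);
            System.out.println(EntityUtils.toString(response.getEntity()));
        }
    }
}
--------------------------------------------------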
diff --git a/docs/reference/sql/functions/date-time.asciidoc b/docs/reference/sql/functions/date-time.asciidoc index 6a9b0c6f8a8e3..b0be286f39c0a 100644 --- a/docs/reference/sql/functions/date-time.asciidoc +++ b/docs/reference/sql/functions/date-time.asciidoc @@ -500,18 +500,19 @@ include-tagged::{sql-specs}/docs/docs.csv-spec[datePartDateTimeTzOffsetMinus] -------------------------------------------------- DATE_TRUNC( string_exp, <1> - datetime_exp) <2> + datetime_exp/interval_exp) <2> -------------------------------------------------- *Input*: -<1> string expression denoting the unit to which the date/datetime should be truncated to -<2> date/datetime expression +<1> string expression denoting the unit to which the date/datetime/interval should be truncated +<2> date/datetime/interval expression -*Output*: datetime +*Output*: datetime/interval -*Description*: Truncate the date/datetime to the specified unit by setting all fields that are less significant than the specified +*Description*: Truncate the date/datetime/interval to the specified unit by setting all fields that are less significant than the specified one to zero (or one, for day, day of week and month). If any of the two arguments is `null` a `null` is returned. +If the first argument is `week` and the second argument is of `interval` type, an error is thrown since the `interval` data type doesn't support a `week` time unit. [cols="^,^"] |=== @@ -563,6 +564,21 @@ include-tagged::{sql-specs}/docs/docs.csv-spec[truncateDateDecades] include-tagged::{sql-specs}/docs/docs.csv-spec[truncateDateQuarter] -------------------------------------------------- +[source, sql] +-------------------------------------------------- +include-tagged::{sql-specs}/docs/docs.csv-spec[truncateIntervalCenturies] +-------------------------------------------------- + +[source, sql] +-------------------------------------------------- +include-tagged::{sql-specs}/docs/docs.csv-spec[truncateIntervalHour] +-------------------------------------------------- + +[source, sql] +-------------------------------------------------- +include-tagged::{sql-specs}/docs/docs.csv-spec[truncateIntervalDay] +-------------------------------------------------- + [[sql-functions-datetime-day]] ==== `DAY_OF_MONTH/DOM/DAY` diff --git a/docs/reference/transform/apis/preview-transform.asciidoc b/docs/reference/transform/apis/preview-transform.asciidoc index 3ba1b945376cc..77effafa4aeea 100644 --- a/docs/reference/transform/apis/preview-transform.asciidoc +++ b/docs/reference/transform/apis/preview-transform.asciidoc @@ -30,7 +30,17 @@ on the source index for the {transform}. For more information, see This API generates a preview of the results that you will get when you run the <> with the same configuration. It returns a maximum of 100 results. The calculations are based -on all the current data in the source index. +on all the current data in the source index. + +It also generates a list of mappings and settings for the destination index. +If the destination index does not exist when you start a {transform}, these are +the mappings and settings that are used. These values are determined based on +the field types of the source index and the {transform} aggregations. + +TIP: There are some <> that +might result in poor mappings. As a workaround, create the destination index +or an index template with your preferred mappings before you start the +{transform}.
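As a concrete illustration of the preview API described above, the following hedged sketch calls `_transform/_preview` through the low-level Java REST client; the source index and pivot definition are placeholders chosen to mirror the examples elsewhere in these docs:

[source,java]
--------------------------------------------------
import org.apache.http.HttpHost;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class PreviewTransformExample {
    public static void main(String[] args) throws Exception {
        try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200, "http")).build()) {
            Request request = new Request("POST", "/_transform/_preview");
            // A minimal pivot: group by customer, compute the maximum order price.
            request.setJsonEntity("{"
                + "\"source\": {\"index\": \"kibana_sample_data_ecommerce\"},"
                + "\"pivot\": {"
                + "  \"group_by\": {\"customer_id\": {\"terms\": {\"field\": \"customer_id\"}}},"
                + "  \"aggregations\": {\"max_price\": {\"max\": {\"field\": \"taxful_total_price\"}}}"
                + "}}");
            Response response = client.performRequest(request);
            // Besides the preview documents, the response now carries the
            // generated_dest_index object with the mappings, settings, and
            // aliases described above.
            System.out.println(EntityUtils.toString(response.getEntity()));
        }
    }
}
--------------------------------------------------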
[[preview-transform-request-body]] ==== {api-request-body-title} @@ -106,7 +116,16 @@ include::{docdir}/rest-api/common-parms.asciidoc[tag=sync-time-field] `preview`:: (array) An array of documents. In particular, they are the JSON representation of the documents that would be created in the destination index - by the {transform}. + by the {transform}. + +`generated_dest_index`:: + (object) Contains details about the destination index. + `mappings`::: + (object) The <> for each document in the destination index. + `settings`::: + (object) The <> for the destination + index. + `aliases`::: The aliases for the destination index. ==== {api-examples-title} @@ -156,15 +175,34 @@ The data that is returned for this example is as follows: } ... ], - "mappings": { - "properties": { - "max_price": { - "type": "double" + "generated_dest_index" : { + "mappings" : { + "_meta" : { + "_transform" : { + "transform" : "transform-preview", + "version" : { + "created" : "7.7.0" + }, + "creation_date_in_millis" : 1584738236757 + }, + "created_by" : "transform" }, - "customer_id": { - "type": "keyword" + "properties" : { + "max_price" : { + "type" : "half_float" + }, + "customer_id" : { + "type" : "keyword" + } } - } + }, + "settings" : { + "index" : { + "number_of_shards" : "1", + "auto_expand_replicas" : "0-1" + } + }, + "aliases" : { } } } ---- diff --git a/docs/reference/transform/ecommerce-tutorial.asciidoc b/docs/reference/transform/ecommerce-tutorial.asciidoc index 99391438aae0c..16b45fde1ee09 100644 --- a/docs/reference/transform/ecommerce-tutorial.asciidoc +++ b/docs/reference/transform/ecommerce-tutorial.asciidoc @@ -8,21 +8,11 @@ from an {es} index, transform it, and store it in another index. Let's use the {kibana-ref}/add-sample-data.html[{kib} sample data] to demonstrate how you can pivot and summarize your data with {transforms}. - -. If the {es} {security-features} are enabled, obtain a user ID with sufficient -privileges to complete these steps. -+ --- -You need `manage_transform` cluster privileges to preview and create -{transforms}. Members of the built-in `transform_admin` role have these -privileges. - -You also need `read` and `view_index_metadata` index privileges on the source -index and `read`, `create_index`, and `index` privileges on the destination -index. - -For more information, see <> and <>. --- +. Verify that your environment is set up properly to use {transforms}. If the +{es} {security-features} are enabled, to complete this tutorial you need a user +that has authority to preview and create {transforms}. You must also have +specific index privileges for the source and destination indices. See +<>. . Choose your _source index_. + @@ -268,3 +258,7 @@ TIP: If you do not want to keep the {transform}, you can delete it in <>. When you delete a {transform}, its destination index and {kib} index patterns remain. + +Now that you've created a simple {transform} for {kib} sample data, consider +possible use cases for your own data. For more ideas, see +<> and <>. \ No newline at end of file diff --git a/docs/reference/transform/index.asciidoc b/docs/reference/transform/index.asciidoc index 595cbdef56d98..84d3afd9e9cb7 100644 --- a/docs/reference/transform/index.asciidoc +++ b/docs/reference/transform/index.asciidoc @@ -11,18 +11,22 @@ indices that summarize the behavior of users or sessions or other entities in your data. 
* <> +* <> * <> * <> * <> * <> +* <> * <> * <> include::overview.asciidoc[] +include::setup.asciidoc[] include::usage.asciidoc[] include::checkpoints.asciidoc[] include::api-quickref.asciidoc[] include::ecommerce-tutorial.asciidoc[] include::examples.asciidoc[] +include::painless-examples.asciidoc[] include::troubleshooting.asciidoc[] include::limitations.asciidoc[] \ No newline at end of file diff --git a/docs/reference/transform/limitations.asciidoc b/docs/reference/transform/limitations.asciidoc index c4c035e88fbcf..88430cf37db94 100644 --- a/docs/reference/transform/limitations.asciidoc +++ b/docs/reference/transform/limitations.asciidoc @@ -46,17 +46,6 @@ A single cluster will support up to 1,000 {transforms}. When using the is returned. Use the `size` and `from` parameters to enumerate through the full list. - -[float] -[[transform-node-assignment-limitations]] -==== {transforms-cap} node assignment not configurable - -{transforms-cap} persistent tasks are assigned to the data node running -fewest persistent tasks at the time of assignment. This cannot be customized. -It means that if {transforms} are being used then `xpack.transform.enabled` -must be set to `true` (which is the default) on every data node in the cluster. - - [float] [[transform-aggresponse-limitations]] ==== Aggregation responses may be incompatible with destination index mappings @@ -217,3 +206,10 @@ this entity will not be updated. If using a `sync.time.field` that represents the data ingest time and using a zero second or very small `sync.time.delay`, then it is more likely that this issue will occur. + +[[transform-date-nanos]] +==== Support for date nanoseconds data type + +If your data uses the <>, aggregations +are nonetheless on millisecond resolution. This limitation also affects the +aggregations in your {transforms}. \ No newline at end of file diff --git a/docs/reference/transform/overview.asciidoc b/docs/reference/transform/overview.asciidoc index 5edc8a9650eec..409928ff438c1 100644 --- a/docs/reference/transform/overview.asciidoc +++ b/docs/reference/transform/overview.asciidoc @@ -28,20 +28,18 @@ The second step is deciding how you want to aggregate the grouped data. When using aggregations, you practically ask questions about the index. There are different types of aggregations, each with its own purpose and output. To learn more about the supported aggregations and group-by fields, see -{ref}/transform-resource.html[{transform-cap} resources]. +<>. As an optional step, you can also add a query to further limit the scope of the aggregation. The {transform} performs a composite aggregation that paginates through all the data defined by the source index query. The output of the aggregation is stored -in a destination index. Each time the {transform} queries the source index, it +in a _destination index_. Each time the {transform} queries the source index, it creates a _checkpoint_. You can decide whether you want the {transform} to run -once (batch {transform}) or continuously ({ctransform}). A batch {transform} is a -single operation that has a single checkpoint. {ctransforms-cap} continually -increment and process checkpoints as new source data is ingested. - -.Example +once or continuously. A _batch {transform}_ is a single operation that has a +single checkpoint. _{ctransforms-cap}_ continually increment and process +checkpoints as new source data is ingested. Imagine that you run a webshop that sells clothes. 
Every order creates a document that contains a unique order ID, the name and the category of the @@ -62,3 +60,19 @@ image::images/pivot-preview.jpg["Example of a {transform} pivot in {kib}"] IMPORTANT: The {transform} leaves your source index intact. It creates a new index that is dedicated to the transformed data. + +[[transform-performance]] +==== Performance considerations + +{transforms-cap} perform search aggregations on the source +indices and then index the results into the destination index. Therefore, a +{transform} never takes less time than the combined duration of the +aggregations that it performs and the indexing process. + +For better performance, make sure that your search aggregations and queries are +optimized and that your {transform} is processing only necessary data. + +NOTE: When you use <>, the +queries are not considered optimal as they run through a significant amount of +data. For this reason, {transforms} performing date histogram aggregations take +longer to run. diff --git a/docs/reference/transform/painless-examples.asciidoc b/docs/reference/transform/painless-examples.asciidoc new file mode 100644 index 0000000000000..8a2a4ec7386e8 --- /dev/null +++ b/docs/reference/transform/painless-examples.asciidoc @@ -0,0 +1,329 @@ +[role="xpack"] +[testenv="basic"] +[[transform-painless-examples]] +=== Painless examples for {transforms} +++++ +Painless examples for {transforms} +++++ + +These examples demonstrate how to use Painless in {transforms}. You can learn +more about the Painless scripting language in the +{painless}/painless-guide.html[Painless guide]. + +* <> +* <> +* <> +* <> + + +[discrete] +[[painless-top-hits]] +==== Getting top hits by using scripted metric + +This snippet shows how to find the latest document, in other words the document +with the latest timestamp. From a technical perspective, it helps to achieve +the function of a <> by using a +scripted metric aggregation, which provides a metric output. + +[source,js] +-------------------------------------------------- +"latest_doc": { + "scripted_metric": { + "init_script": "state.timestamp_latest = 0L; state.last_doc = ''", <1> + "map_script": """ <2> + def current_date = doc['@timestamp'].getValue().toInstant().toEpochMilli(); + if (current_date > state.timestamp_latest) + {state.timestamp_latest = current_date; + state.last_doc = new HashMap(params['_source']);} + """, + "combine_script": "return state", <3> + "reduce_script": """ <4> + def last_doc = ''; + def timestamp_latest = 0L; + for (s in states) {if (s.timestamp_latest > (timestamp_latest)) + {timestamp_latest = s.timestamp_latest; last_doc = s.last_doc;}} + return last_doc + """ + } +} +-------------------------------------------------- +// NOTCONSOLE + +<1> The `init_script` creates a long type `timestamp_latest` and a string type +`last_doc` in the `state` object. +<2> The `map_script` defines `current_date` based on the timestamp of the +document, then compares `current_date` with `state.timestamp_latest`, and finally +returns `state.last_doc` from the shard. By using `new HashMap(...)` we copy the +source document; this is important whenever you want to pass the full source +object from one phase to the next. +<3> The `combine_script` returns `state` from each shard. +<4> The `reduce_script` iterates through the values of `s.timestamp_latest` +returned by each shard and returns the document with the latest timestamp +(`last_doc`). In the response, the top hit (in other words, the `latest_doc`) is +nested below the `latest_doc` field.
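For contrast, outside of a {transform} the same "latest document" question is exactly what a top hits aggregation answers directly; a hedged sketch using the Java high-level client builders (assuming the same `@timestamp` field as the snippet above) could look like:

[source,java]
--------------------------------------------------
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.sort.SortOrder;

public class LatestDocExample {
    // Builds a search that keeps only the single most recent document,
    // mirroring the intent of the scripted metric aggregation above.
    public static SearchSourceBuilder latestDocSearch() {
        return new SearchSourceBuilder()
            .size(0) // we only care about the aggregation result
            .query(QueryBuilders.matchAllQuery())
            .aggregation(AggregationBuilders.topHits("latest_doc")
                .size(1)
                .sort("@timestamp", SortOrder.DESC));
    }
}
--------------------------------------------------

The scripted metric variant above remains the way to get this behavior inside a pivot, where a plain top hits aggregation is not available.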
+ +Check the +<> +for a detailed explanation of the respective scripts. + +You can retrieve the last value in a similar way: + +[source,js] +-------------------------------------------------- +"latest_value": { + "scripted_metric": { + "init_script": "state.timestamp_latest = 0L; state.last_value = ''", + "map_script": """ + def current_date = doc['date'].getValue().toInstant().toEpochMilli(); + if (current_date > state.timestamp_latest) + {state.timestamp_latest = current_date; + state.last_value = params['_source']['value'];} + """, + "combine_script": "return state", + "reduce_script": """ + def last_value = ''; + def timestamp_latest = 0L; + for (s in states) {if (s.timestamp_latest > (timestamp_latest)) + {timestamp_latest = s.timestamp_latest; last_value = s.last_value;}} + return last_value + """ + } +} +-------------------------------------------------- +// NOTCONSOLE + + +[discrete] +[[painless-time-features]] +==== Getting time features as scripted fields + +This snippet shows how to extract time-based features by using Painless. The +snippet uses an index where `@timestamp` is defined as a `date` type field. + +[source,js] +-------------------------------------------------- +"script_fields": { + "hour_of_day": { <1> + "script": { + "lang": "painless", + "source": """ + ZonedDateTime date = doc['@timestamp'].value; <2> + return date.getHour(); <3> + """ + } + }, + "month_of_year": { <4> + "script": { + "lang": "painless", + "source": """ + ZonedDateTime date = doc['@timestamp'].value; <5> + return date.getMonthValue(); <6> + """ + } + } + } +-------------------------------------------------- +// NOTCONSOLE + +<1> Contains the Painless script that returns the hour of the day. +<2> Sets `date` based on the timestamp of the document. +<3> Returns the hour value from `date`. +<4> Contains the Painless script that returns the month of the year. +<5> Sets `date` based on the timestamp of the document. +<6> Returns the month value from `date`. + + +[discrete] +[[painless-group-by]] +==== Using Painless in `group_by` + +It is possible to base the `group_by` property of a {transform} on the output of +a script. The following example uses the {kib} sample web logs dataset. The goal +here is to make the {transform} output easier to understand by normalizing +the values of the fields that the data is grouped by. + +[source,console] +-------------------------------------------------- +POST _transform/_preview +{ + "source": { + "index": [ <1> + "kibana_sample_data_logs" + ] + }, + "pivot": { + "group_by": { + "agent": { + "terms": { + "script": { <2> + "source": """String agent = doc['agent.keyword'].value; + if (agent.contains("MSIE")) { + return "internet explorer"; + } else if (agent.contains("AppleWebKit")) { + return "safari"; + } else if (agent.contains('Firefox')) { + return "firefox"; + } else { return agent }""", + "lang": "painless" + } + } + } + }, + "aggregations": { <3> + "200": { + "filter": { + "term": { + "response": "200" + } + } + }, + "404": { + "filter": { + "term": { + "response": "404" + } + } + }, + "503": { + "filter": { + "term": { + "response": "503" + } + } + } + } + }, + "dest": { <4> + "index": "pivot_logs" + } +} +-------------------------------------------------- +// TEST[skip:setup kibana sample data] + +<1> Specifies the source index or indices. +<2> The script defines an `agent` string based on the `agent` field of the +documents, then checks the value against a set of known substrings. If the `agent` field contains +"MSIE", then the script returns "internet explorer".
If it contains +`AppleWebKit`, it returns "safari". It returns "firefox" if the field value +contains "Firefox". Finally, in every other case, the value of the field is +returned. +<3> The aggregations object contains filters that narrow down the results to +documents that contain `200`, `404`, or `503` values in the `response` field. +<4> Specifies the destination index of the {transform}. + +The API returns the following result: + +[source,js] +-------------------------------------------------- +{ + "preview" : [ + { + "agent" : "firefox", + "200" : 4931, + "404" : 259, + "503" : 172 + }, + { + "agent" : "internet explorer", + "200" : 3674, + "404" : 210, + "503" : 126 + }, + { + "agent" : "safari", + "200" : 4227, + "404" : 332, + "503" : 143 + } + ], + "mappings" : { + "properties" : { + "200" : { + "type" : "long" + }, + "agent" : { + "type" : "keyword" + }, + "404" : { + "type" : "long" + }, + "503" : { + "type" : "long" + } + } + } +} +-------------------------------------------------- +// NOTCONSOLE + +You can see that the `agent` values are simplified so it is easier to interpret +them. The table below shows how normalization modifies the output of the +{transform} in our example compared to the non-normalized values. + +[width="50%"] + +|=== +| Non-normalized `agent` value | Normalized `agent` value + +| "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322)" | "internet explorer" +| "Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.50 Safari/534.24" | "safari" +| "Mozilla/5.0 (X11; Linux x86_64; rv:6.0a1) Gecko/20110421 Firefox/6.0a1" | "firefox" +|=== + + +[discrete] +[[painless-bucket-script]] +==== Getting duration by using bucket script + +This example shows you how to get the duration of a session by client IP from a +data log by using a +{ref}/search-aggregations-pipeline-bucket-script-aggregation.html[bucket script] +aggregation. The example uses the {kib} sample web logs dataset. + +[source,console] +-------------------------------------------------- +PUT _transform/data_log +{ + "source": { + "index": "kibana_sample_data_logs" + }, + "dest": { + "index": "data-logs-by-client" + }, + "pivot": { + "group_by": { + "machine.os": {"terms": {"field": "machine.os.keyword"}}, + "machine.ip": {"terms": {"field": "clientip"}} + }, + "aggregations": { + "time_frame.lte": { + "max": { + "field": "timestamp" + } + }, + "time_frame.gte": { + "min": { + "field": "timestamp" + } + }, + "time_length": { <1> + "bucket_script": { + "buckets_path": { <2> + "min": "time_frame.gte.value", + "max": "time_frame.lte.value" + }, + "script": "params.max - params.min" <3> + } + } + } + } +} +-------------------------------------------------- +// TEST[skip:setup kibana sample data] + +<1> To define the length of the sessions, we use a bucket script. +<2> The bucket path is a map of script variables and their associated paths to +the buckets you want to use for the variable. In this particular case, `min` and +`max` are variables mapped to `time_frame.gte.value` and `time_frame.lte.value`. +<3> Finally, the script subtracts the start date of the session from the end +date, which results in the duration of the session.
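The pipeline above can also be assembled with the Java aggregation builders. A hedged sketch follows; the metric sub-aggregations are renamed (`session_start`, `session_end`) by choice here, to keep the `buckets_path` references simple without dotted aggregation names:

[source,java]
--------------------------------------------------
import java.util.HashMap;
import java.util.Map;

import org.elasticsearch.script.Script;
import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.aggregations.pipeline.PipelineAggregatorBuilders;
import org.elasticsearch.search.builder.SearchSourceBuilder;

public class SessionDurationExample {
    // Builds per-client-IP buckets whose time_length is the difference between
    // the newest and oldest timestamp in the bucket, as in the JSON example above.
    public static SearchSourceBuilder sessionDurations() {
        Map<String, String> bucketsPath = new HashMap<>();
        bucketsPath.put("min", "session_start");
        bucketsPath.put("max", "session_end");
        return new SearchSourceBuilder()
            .size(0)
            .aggregation(AggregationBuilders.terms("by_ip").field("clientip")
                .subAggregation(AggregationBuilders.min("session_start").field("timestamp"))
                .subAggregation(AggregationBuilders.max("session_end").field("timestamp"))
                .subAggregation(PipelineAggregatorBuilders.bucketScript(
                    "time_length", bucketsPath, new Script("params.max - params.min"))));
    }
}
--------------------------------------------------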
diff --git a/docs/reference/transform/setup.asciidoc b/docs/reference/transform/setup.asciidoc new file mode 100644 index 0000000000000..862b86eeb74c8 --- /dev/null +++ b/docs/reference/transform/setup.asciidoc @@ -0,0 +1,40 @@ +[role="xpack"] +[[transform-setup]] +=== Set up {transforms} +++++ +Setup +++++ + +To use the {transforms}, you must have the +{subscriptions}[appropriate license] and at least one +<> in your {es} cluster. If {stack} +{security-features} are enabled, you must also ensure your users have the +<>. + +[discrete] +[[transform-setup-nodes]] +==== {transform-cap} nodes + +To use {transforms}, there must be at least one node in your cluster with +`xpack.transform.enabled` and `node.transform` set to `true`. By default, all +nodes are {transform} nodes unless you explicitly change these settings or set +`node.data` to `false`. + +If you want to control which nodes run {transforms}, set `node.transform` to +`false` on some nodes. + +For more information, see <> and <>. + +[discrete] +[[transform-privileges]] +==== Security privileges + +The {es} {security-features} provide <> +and <> that make it easier to control +which users can manage or view {transforms}. + +For example, you need `manage_transform` cluster privileges to preview and +create {transforms}. Members of the built-in `transform_admin` role have these +privileges. You also need `read` and `view_index_metadata` index privileges on +the source index and `read`, `create_index`, and `index` privileges on the +destination index. diff --git a/docs/reference/upgrade.asciidoc b/docs/reference/upgrade.asciidoc index 72fffc50ac97a..56b4f1ff43de3 100644 --- a/docs/reference/upgrade.asciidoc +++ b/docs/reference/upgrade.asciidoc @@ -43,10 +43,18 @@ a| [WARNING] ==== -The following upgrade paths are *not* supported: +The upgrade path from 6.8 to 7.0 is *not* supported (both full cluster restart and rolling upgrade). +==== -* 6.8 to 7.0. -* 6.7 to 7.1.–{prev-major-version}. +To upgrade directly to {version} from 6.7 or earlier, you must shut down the +cluster, install {version}, and restart. For more information, see +<>. + +[WARNING] +==== +In-place downgrades to earlier versions are *not* supported. To downgrade to an +earlier version, <> taken prior +to the version upgrade. ==== {es} can read indices created in the previous major version. If you @@ -60,10 +68,6 @@ When upgrading to a new version of {es}, you need to upgrade each of the products in your Elastic Stack. For more information, see the {stack-ref}/upgrading-elastic-stack.html[Elastic Stack Installation and Upgrade Guide]. -To upgrade directly to {version} from 6.6 or earlier, you must shut down the -cluster, install {version}, and restart. For more information, see -<>. 
- -- include::upgrade/rolling_upgrade.asciidoc[] diff --git a/docs/src/test/java/org/elasticsearch/smoketest/DocsClientYamlTestSuiteIT.java b/docs/src/test/java/org/elasticsearch/smoketest/DocsClientYamlTestSuiteIT.java index 249101cfc54a3..069945f1b7474 100644 --- a/docs/src/test/java/org/elasticsearch/smoketest/DocsClientYamlTestSuiteIT.java +++ b/docs/src/test/java/org/elasticsearch/smoketest/DocsClientYamlTestSuiteIT.java @@ -22,7 +22,6 @@ import com.carrotsearch.randomizedtesting.annotations.Name; import com.carrotsearch.randomizedtesting.annotations.ParametersFactory; import com.carrotsearch.randomizedtesting.annotations.TimeoutSuite; - import org.apache.http.HttpHost; import org.apache.http.util.EntityUtils; import org.apache.lucene.util.BytesRef; @@ -46,6 +45,7 @@ import org.elasticsearch.test.rest.yaml.restspec.ClientYamlSuiteRestSpec; import org.elasticsearch.test.rest.yaml.section.ExecutableSection; import org.junit.After; +import org.junit.Before; import java.io.IOException; import java.util.ArrayList; @@ -102,6 +102,13 @@ protected ClientYamlTestClient initClientYamlTestClient( return new ClientYamlDocsTestClient(restSpec, restClient, hosts, esVersion, masterVersion, this::getClientBuilderWithSniffedHosts); } + @Before + public void waitForRequirements() throws Exception { + if (isCcrTest()) { + ESRestTestCase.waitForActiveLicense(adminClient()); + } + } + @After public void cleanup() throws Exception { if (isMachineLearningTest() || isTransformTest()) { @@ -163,6 +170,11 @@ protected boolean isTransformTest() { return testName != null && (testName.contains("/transform/") || testName.contains("\\transform\\")); } + protected boolean isCcrTest() { + String testName = getTestName(); + return testName != null && testName.contains("/ccr/"); + } + /** * Compares the results of running two analyzers against many random * strings. The goal is to figure out if two anlayzers are "the same" by diff --git a/gradle/ide.gradle b/gradle/ide.gradle new file mode 100644 index 0000000000000..c73d6e5a1bfdd --- /dev/null +++ b/gradle/ide.gradle @@ -0,0 +1,126 @@ +import org.jetbrains.gradle.ext.Remote +import org.jetbrains.gradle.ext.JUnit + +buildscript { + repositories { + maven { + url "https://plugins.gradle.org/m2/" + } + } + dependencies { + classpath "gradle.plugin.org.jetbrains.gradle.plugin.idea-ext:gradle-idea-ext:0.7" + } +} + +apply plugin: org.jetbrains.gradle.ext.IdeaExtPlugin + +allprojects { + apply plugin: 'idea' + + tasks.named('idea') { + doFirst { throw new GradleException("Use of the 'idea' task has been deprecated. 
For details on importing into IntelliJ see CONTRIBUTING.md.") } + } +} + +tasks.register('configureIdeaGradleJvm') { + group = 'ide' + description = 'Configures the appropriate JVM for Gradle' + + doLast { + modifyXml('.idea/gradle.xml') { xml -> + def gradleSettings = xml.component.find { it.'@name' == 'GradleSettings' }.option[0].GradleProjectSettings + // Remove configured JVM option to force IntelliJ to use the project JDK for Gradle + gradleSettings.option.findAll { it.'@name' == 'gradleJvm' }.each { it.parent().remove(it) } + } + } +} + +idea { + project { + vcs = 'Git' + jdkName = '13' + + settings { + delegateActions { + delegateBuildRunToGradle = false + testRunner = 'choose_per_test' + } + taskTriggers { + afterSync tasks.named('configureIdeaGradleJvm') + } + codeStyle { + java { + classCountToUseImportOnDemand = 999 + } + } + encodings { + encoding = 'UTF-8' + } + compiler { + parallelCompilation = true + javac { + generateDeprecationWarnings = false + } + } + runConfigurations { + 'Debug Elasticsearch'(Remote) { + mode = 'listen' + host = 'localhost' + port = 5005 + } + defaults(JUnit) { + vmParameters = '-ea -Djava.locale.providers=SPI,COMPAT' + } + } + copyright { + useDefault = 'Apache' + scopes = ['x-pack': 'Elastic'] + profiles { + Apache { + keyword = 'Licensed to Elasticsearch under one or more contributor' + notice = '''\ + Licensed to Elasticsearch under one or more contributor + license agreements. See the NOTICE file distributed with + this work for additional information regarding copyright + ownership. Elasticsearch licenses this file to you under + the Apache License, Version 2.0 (the "License"); you may + not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, + software distributed under the License is distributed on an + "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + KIND, either express or implied. See the License for the + specific language governing permissions and limitations + under the License.'''.stripIndent() + } + Elastic { + keyword = 'Licensed under the Elastic License' + notice = '''\ + Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one + or more contributor license agreements. Licensed under the Elastic License; + you may not use this file except in compliance with the Elastic License.'''.stripIndent() + } + } + } + } + } +} + +/** + * Parses a given XML file, applies a set of changes, and writes those changes back to the original file. 
+ * + * @param path Path to existing XML file + * @param action Action to perform on parsed XML document + */ +void modifyXml(Object path, Action action) { + File xmlFile = project.file(path) + Node xml = new XmlParser().parse(xmlFile) + action.execute(xml) + + xmlFile.withPrintWriter { writer -> + new XmlNodePrinter(writer).print(xml) + } +} diff --git a/gradle/runtime-jdk-provision.gradle b/gradle/runtime-jdk-provision.gradle index e05da2d47c5a6..eea8f87822c1a 100644 --- a/gradle/runtime-jdk-provision.gradle +++ b/gradle/runtime-jdk-provision.gradle @@ -1,3 +1,4 @@ +import org.elasticsearch.gradle.Architecture import org.elasticsearch.gradle.OS import org.elasticsearch.gradle.VersionProperties import org.elasticsearch.gradle.info.BuildParams @@ -9,6 +10,7 @@ jdks { vendor = VersionProperties.bundledJdkVendor version = VersionProperties.getBundledJdk(OS.current().name().toLowerCase()) platform = OS.current().name().toLowerCase() + architecture = Architecture.current().name().toLowerCase() } } diff --git a/libs/geo/src/main/java/org/elasticsearch/geometry/LinearRing.java b/libs/geo/src/main/java/org/elasticsearch/geometry/LinearRing.java index c0d0150c3023e..babe02bdf55af 100644 --- a/libs/geo/src/main/java/org/elasticsearch/geometry/LinearRing.java +++ b/libs/geo/src/main/java/org/elasticsearch/geometry/LinearRing.java @@ -64,6 +64,6 @@ public T visit(GeometryVisitor visitor) throws E public String toString() { return "linearring(x=" + Arrays.toString(getX()) + ", y=" + Arrays.toString(getY()) + - (hasZ() ? ", z=" + Arrays.toString(getZ()) : ""); + (hasZ() ? ", z=" + Arrays.toString(getZ()) : "") + ")"; } } diff --git a/libs/x-content/build.gradle b/libs/x-content/build.gradle index 14adf7d102d78..be056980ef50d 100644 --- a/libs/x-content/build.gradle +++ b/libs/x-content/build.gradle @@ -49,6 +49,7 @@ forbiddenApisMain { thirdPartyAudit.ignoreMissingClasses( // from com.fasterxml.jackson.dataformat.yaml.YAMLMapper (jackson-dataformat-yaml) 'com.fasterxml.jackson.databind.ObjectMapper', + 'com.fasterxml.jackson.databind.cfg.MapperBuilder' ) dependencyLicenses { diff --git a/libs/x-content/licenses/jackson-core-2.10.3.jar.sha1 b/libs/x-content/licenses/jackson-core-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..f23937b0d82a4 --- /dev/null +++ b/libs/x-content/licenses/jackson-core-2.10.3.jar.sha1 @@ -0,0 +1 @@ +f7ee7b55c7d292ac72fbaa7648c089f069c938d2 \ No newline at end of file diff --git a/libs/x-content/licenses/jackson-core-2.8.11.jar.sha1 b/libs/x-content/licenses/jackson-core-2.8.11.jar.sha1 deleted file mode 100644 index e7ad1e74ed6b8..0000000000000 --- a/libs/x-content/licenses/jackson-core-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -876ead1db19f0c9e79c9789273a3ef8c6fd6c29b \ No newline at end of file diff --git a/libs/x-content/licenses/jackson-dataformat-cbor-2.10.3.jar.sha1 b/libs/x-content/licenses/jackson-dataformat-cbor-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..6c96c20f48232 --- /dev/null +++ b/libs/x-content/licenses/jackson-dataformat-cbor-2.10.3.jar.sha1 @@ -0,0 +1 @@ +1ba01fef9c3b7ed388d91e71dc733b315c7374cd \ No newline at end of file diff --git a/libs/x-content/licenses/jackson-dataformat-cbor-2.8.11.jar.sha1 b/libs/x-content/licenses/jackson-dataformat-cbor-2.8.11.jar.sha1 deleted file mode 100644 index 378ba524422bc..0000000000000 --- a/libs/x-content/licenses/jackson-dataformat-cbor-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -8b9826e16c3366764bfb7ad7362554f0471046c3 \ No newline at end of file diff --git 
a/libs/x-content/licenses/jackson-dataformat-smile-2.10.3.jar.sha1 b/libs/x-content/licenses/jackson-dataformat-smile-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..24ac4b32095e4 --- /dev/null +++ b/libs/x-content/licenses/jackson-dataformat-smile-2.10.3.jar.sha1 @@ -0,0 +1 @@ +ff397547ff168e77279a1cd549e2ca4923c991aa \ No newline at end of file diff --git a/libs/x-content/licenses/jackson-dataformat-smile-2.8.11.jar.sha1 b/libs/x-content/licenses/jackson-dataformat-smile-2.8.11.jar.sha1 deleted file mode 100644 index 510afb3df53e6..0000000000000 --- a/libs/x-content/licenses/jackson-dataformat-smile-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -d9d1c49c5d9d5e46e2aee55f3cdd119286fe0fc1 \ No newline at end of file diff --git a/libs/x-content/licenses/jackson-dataformat-yaml-2.10.3.jar.sha1 b/libs/x-content/licenses/jackson-dataformat-yaml-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..8f1ecab9ecb38 --- /dev/null +++ b/libs/x-content/licenses/jackson-dataformat-yaml-2.10.3.jar.sha1 @@ -0,0 +1 @@ +4dc1a172812d9da27c1afd6a08f4f12aad7b14dd \ No newline at end of file diff --git a/libs/x-content/licenses/jackson-dataformat-yaml-2.8.11.jar.sha1 b/libs/x-content/licenses/jackson-dataformat-yaml-2.8.11.jar.sha1 deleted file mode 100644 index 78a68d715ec3d..0000000000000 --- a/libs/x-content/licenses/jackson-dataformat-yaml-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -2e77c6ff7342cd61ab1ae7cb14ed16aebfc8a72a \ No newline at end of file diff --git a/libs/x-content/licenses/snakeyaml-1.17.jar.sha1 b/libs/x-content/licenses/snakeyaml-1.17.jar.sha1 deleted file mode 100644 index 9ac6e87f2244a..0000000000000 --- a/libs/x-content/licenses/snakeyaml-1.17.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -7a27ea250c5130b2922b86dea63cbb1cc10a660c \ No newline at end of file diff --git a/libs/x-content/licenses/snakeyaml-1.24.jar.sha1 b/libs/x-content/licenses/snakeyaml-1.24.jar.sha1 new file mode 100644 index 0000000000000..a6e7bd5a1e7c1 --- /dev/null +++ b/libs/x-content/licenses/snakeyaml-1.24.jar.sha1 @@ -0,0 +1 @@ +13a9c0d6776483c3876e3ff9384f9bb55b17001b \ No newline at end of file diff --git a/libs/x-content/src/main/java/org/elasticsearch/common/ParseField.java b/libs/x-content/src/main/java/org/elasticsearch/common/ParseField.java index 084d82372c0ce..3872ef4852274 100644 --- a/libs/x-content/src/main/java/org/elasticsearch/common/ParseField.java +++ b/libs/x-content/src/main/java/org/elasticsearch/common/ParseField.java @@ -19,11 +19,13 @@ package org.elasticsearch.common; import org.elasticsearch.common.xcontent.DeprecationHandler; +import org.elasticsearch.common.xcontent.XContentLocation; import java.util.Collections; import java.util.HashSet; import java.util.Objects; import java.util.Set; +import java.util.function.Supplier; /** * Holds a field that can be found in a request while parsing and its different @@ -34,6 +36,7 @@ public class ParseField { private final String[] deprecatedNames; private String allReplacedWith = null; private final String[] allNames; + private boolean fullyDeprecated = false; private static final String[] EMPTY = new String[0]; @@ -96,6 +99,15 @@ public ParseField withAllDeprecated(String allReplacedWith) { return parseField; } + /** + * Return a new ParseField where all field names are deprecated with no replacement + */ + public ParseField withAllDeprecated() { + ParseField parseField = this.withDeprecation(getAllNamesIncludedDeprecated()); + parseField.fullyDeprecated = true; + return parseField; + } + /** * Does {@code fieldName} match this field? 
* @param fieldName @@ -105,10 +117,26 @@ public ParseField withAllDeprecated(String allReplacedWith) { * names for this {@link ParseField}. */ public boolean match(String fieldName, DeprecationHandler deprecationHandler) { + return match(null, () -> XContentLocation.UNKNOWN, fieldName, deprecationHandler); + } + + /** + * Does {@code fieldName} match this field? + * @param parserName + * the name of the parent object holding this field + * @param location + * the XContentLocation of the field + * @param fieldName + * the field name to match against this {@link ParseField} + * @param deprecationHandler called if {@code fieldName} is deprecated + * @return true if fieldName matches any of the acceptable + * names for this {@link ParseField}. + */ + public boolean match(String parserName, Supplier location, String fieldName, DeprecationHandler deprecationHandler) { Objects.requireNonNull(fieldName, "fieldName cannot be null"); // if this parse field has not been completely deprecated then try to // match the preferred name - if (allReplacedWith == null && fieldName.equals(name)) { + if (fullyDeprecated == false && allReplacedWith == null && fieldName.equals(name)) { return true; } // Now try to match against one of the deprecated names. Note that if @@ -116,10 +144,12 @@ public boolean match(String fieldName, DeprecationHandler deprecationHandler) { // fields will be in the deprecatedNames array for (String depName : deprecatedNames) { if (fieldName.equals(depName)) { - if (allReplacedWith == null) { - deprecationHandler.usedDeprecatedName(fieldName, name); + if (fullyDeprecated) { + deprecationHandler.usedDeprecatedField(parserName, location, fieldName); + } else if (allReplacedWith == null) { + deprecationHandler.usedDeprecatedName(parserName, location, fieldName, name); } else { - deprecationHandler.usedDeprecatedField(fieldName, allReplacedWith); + deprecationHandler.usedDeprecatedField(parserName, location, fieldName, allReplacedWith); } return true; } diff --git a/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/DeprecationHandler.java b/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/DeprecationHandler.java index 1b0dcf4568086..65f52943f0b86 100644 --- a/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/DeprecationHandler.java +++ b/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/DeprecationHandler.java @@ -19,6 +19,8 @@ package org.elasticsearch.common.xcontent; +import java.util.function.Supplier; + /** * Callback for notifying the creator of the {@link XContentParser} that * parsing hit a deprecated field. 
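To see how the new `ParseField` hooks above and the extended `DeprecationHandler` callbacks below fit together, here is a small hedged sketch; the field and parser names are made up for illustration:

[source,java]
--------------------------------------------------
import org.elasticsearch.common.ParseField;
import org.elasticsearch.common.xcontent.DeprecationHandler;
import org.elasticsearch.common.xcontent.XContentLocation;

public class FullyDeprecatedFieldExample {
    public static void main(String[] args) {
        // A field that is deprecated entirely, with no replacement.
        ParseField field = new ParseField("legacy_option").withAllDeprecated();

        // The deprecated name still matches, so old requests keep parsing.
        boolean matches = field.match(
            "my_parser",                    // parser name, surfaced in messages
            () -> XContentLocation.UNKNOWN, // location supplier, evaluated lazily
            "legacy_option",
            DeprecationHandler.IGNORE_DEPRECATIONS); // accept silently
        System.out.println("matches: " + matches);   // prints: matches: true

        // Swapping in THROW_UNSUPPORTED_OPERATION would instead throw, reporting
        // the parser name and token location together with the deprecated name.
    }
}
--------------------------------------------------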
@@ -32,14 +34,55 @@ public interface DeprecationHandler { */ DeprecationHandler THROW_UNSUPPORTED_OPERATION = new DeprecationHandler() { @Override - public void usedDeprecatedField(String usedName, String replacedWith) { - throw new UnsupportedOperationException("deprecated fields not supported here but got [" - + usedName + "] which is a deprecated name for [" + replacedWith + "]"); + public void usedDeprecatedField(String parserName, Supplier location, String usedName, String replacedWith) { + if (parserName != null) { + throw new UnsupportedOperationException("deprecated fields not supported in [" + parserName + "] but got [" + + usedName + "] at [" + location.get() + "] which is a deprecated name for [" + replacedWith + "]"); + } else { + throw new UnsupportedOperationException("deprecated fields not supported here but got [" + + usedName + "] which is a deprecated name for [" + replacedWith + "]"); + } + } + @Override + public void usedDeprecatedName(String parserName, Supplier location, String usedName, String modernName) { + if (parserName != null) { + throw new UnsupportedOperationException("deprecated fields not supported in [" + parserName + "] but got [" + + usedName + "] at [" + location.get() + "] which has been replaced with [" + modernName + "]"); + } else { + throw new UnsupportedOperationException("deprecated fields not supported here but got [" + + usedName + "] which has been replaced with [" + modernName + "]"); + } } + @Override - public void usedDeprecatedName(String usedName, String modernName) { - throw new UnsupportedOperationException("deprecated fields not supported here but got [" - + usedName + "] which has been replaced with [" + modernName + "]"); + public void usedDeprecatedField(String parserName, Supplier location, String usedName) { + if (parserName != null) { + throw new UnsupportedOperationException("deprecated fields not supported in [" + parserName + "] but got [" + + usedName + "] at [" + location.get() + "] which has been deprecated entirely"); + } else { + throw new UnsupportedOperationException("deprecated fields not supported here but got [" + + usedName + "] which has been deprecated entirely"); + } + } + }; + + /** + * Ignores all deprecations + */ + DeprecationHandler IGNORE_DEPRECATIONS = new DeprecationHandler() { + @Override + public void usedDeprecatedName(String parserName, Supplier location, String usedName, String modernName) { + + } + + @Override + public void usedDeprecatedField(String parserName, Supplier location, String usedName, String replacedWith) { + + } + + @Override + public void usedDeprecatedField(String parserName, Supplier location, String usedName) { + } }; @@ -48,13 +91,21 @@ public void usedDeprecatedName(String usedName, String modernName) { * @param usedName the provided field name * @param modernName the modern name for the field */ - void usedDeprecatedName(String usedName, String modernName); + void usedDeprecatedName(String parserName, Supplier location, String usedName, String modernName); /** * Called when the provided field name matches the current field but the entire - * field has been marked as deprecated. 
+ * field has been marked as deprecated and another field should be used * @param usedName the provided field name * @param replacedWith the name of the field that replaced this field */ - void usedDeprecatedField(String usedName, String replacedWith); + void usedDeprecatedField(String parserName, Supplier location, String usedName, String replacedWith); + + /** + * Called when the provided field name matches the current field but the entire + * field has been marked as deprecated with no replacement + * @param usedName the provided field name + */ + void usedDeprecatedField(String parserName, Supplier location, String usedName); + } diff --git a/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/ObjectParser.java b/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/ObjectParser.java index b11f6afd4bb26..73bbf812f598c 100644 --- a/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/ObjectParser.java +++ b/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/ObjectParser.java @@ -571,7 +571,7 @@ private class FieldParser { } void assertSupports(String parserName, XContentParser parser, String currentFieldName) { - if (parseField.match(currentFieldName, parser.getDeprecationHandler()) == false) { + if (parseField.match(parserName, parser::getTokenLocation, currentFieldName, parser.getDeprecationHandler()) == false) { throw new XContentParseException(parser.getTokenLocation(), "[" + parserName + "] parsefield doesn't accept: " + currentFieldName); } diff --git a/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/XContentLocation.java b/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/XContentLocation.java index 43ab7503cd1dd..1d5bfd6c2c429 100644 --- a/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/XContentLocation.java +++ b/libs/x-content/src/main/java/org/elasticsearch/common/xcontent/XContentLocation.java @@ -25,7 +25,10 @@ * position of a parsing error to end users and consequently have line and * column numbers starting from 1. 
*/ -public class XContentLocation { +public final class XContentLocation { + + public static final XContentLocation UNKNOWN = new XContentLocation(-1, -1); + public final int lineNumber; public final int columnNumber; diff --git a/libs/x-content/src/test/java/org/elasticsearch/common/ParseFieldTests.java b/libs/x-content/src/test/java/org/elasticsearch/common/ParseFieldTests.java index 72ba5578a49cb..e7420356c8fac 100644 --- a/libs/x-content/src/test/java/org/elasticsearch/common/ParseFieldTests.java +++ b/libs/x-content/src/test/java/org/elasticsearch/common/ParseFieldTests.java @@ -60,6 +60,21 @@ public void testAllDeprecated() { assertWarnings("Deprecated field [like_text] used, replaced by [like]"); } + public void testDeprecatedWithNoReplacement() { + String name = "dep"; + String[] alternatives = new String[]{"old_dep", "new_dep"}; + ParseField field = new ParseField(name).withDeprecation(alternatives).withAllDeprecated(); + assertFalse(field.match("not a field name", LoggingDeprecationHandler.INSTANCE)); + assertTrue(field.match("dep", LoggingDeprecationHandler.INSTANCE)); + assertWarnings("Deprecated field [dep] used, this field is unused and will be removed entirely"); + assertTrue(field.match("old_dep", LoggingDeprecationHandler.INSTANCE)); + assertWarnings("Deprecated field [old_dep] used, this field is unused and will be removed entirely"); + assertTrue(field.match("new_dep", LoggingDeprecationHandler.INSTANCE)); + assertWarnings("Deprecated field [new_dep] used, this field is unused and will be removed entirely"); + + + } + public void testGetAllNamesIncludedDeprecated() { ParseField parseField = new ParseField("terms", "in"); assertThat(parseField.getAllNamesIncludedDeprecated(), arrayContainingInAnyOrder("terms", "in")); diff --git a/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/MapXContentParserTests.java b/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/MapXContentParserTests.java index 0d2113152ebe8..b3fb9eee4662d 100644 --- a/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/MapXContentParserTests.java +++ b/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/MapXContentParserTests.java @@ -25,6 +25,7 @@ import org.elasticsearch.test.ESTestCase; import java.io.IOException; +import java.util.EnumSet; import java.util.Map; import static org.elasticsearch.common.xcontent.XContentParserTests.generateRandomObject; @@ -73,7 +74,13 @@ public void testRandomObject() throws IOException { } public void compareTokens(CheckedConsumer consumer) throws IOException { - final XContentType xContentType = randomFrom(XContentType.values()); + for (XContentType xContentType : EnumSet.allOf(XContentType.class)) { + logger.info("--> testing with xcontent type: {}", xContentType); + compareTokens(consumer, xContentType); + } + } + + public void compareTokens(CheckedConsumer consumer, XContentType xContentType) throws IOException { try (XContentBuilder builder = XContentBuilder.builder(xContentType.xContent())) { consumer.accept(builder); final Map map; @@ -94,7 +101,13 @@ public void compareTokens(CheckedConsumer consumer assertEquals(token, mapToken); assertEquals(parser.currentName(), mapParser.currentName()); if (token != null && (token.isValue() || token == XContentParser.Token.VALUE_NULL)) { - assertEquals(parser.textOrNull(), mapParser.textOrNull()); + if (xContentType != XContentType.YAML || token != XContentParser.Token.VALUE_EMBEDDED_OBJECT) { + // YAML struggles with converting byte arrays into text, because it + // does weird 
base64 decoding to the values. We don't do this + // weirdness in the MapXContentParser, so don't try to stringify it. + // The .binaryValue() comparison below still works correctly though. + assertEquals(parser.textOrNull(), mapParser.textOrNull()); + } switch (token) { case VALUE_STRING: assertEquals(parser.text(), mapParser.text()); diff --git a/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/ObjectParserTests.java b/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/ObjectParserTests.java index c99d1b10d6a4d..5eca7d527730b 100644 --- a/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/ObjectParserTests.java +++ b/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/ObjectParserTests.java @@ -223,7 +223,7 @@ class TestStruct { objectParser.declareField((i, v, c) -> v.test = i.text(), new ParseField("test", "old_test"), ObjectParser.ValueType.STRING); objectParser.parse(parser, s, null); assertEquals("foo", s.test); - assertWarnings("Deprecated field [old_test] used, expected [test] instead"); + assertWarnings(false, "[foo][1:15] Deprecated field [old_test] used, expected [test] instead"); } public void testFailOnValueType() throws IOException { diff --git a/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/XContentParserTests.java b/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/XContentParserTests.java index 0cfa01876c590..66c0041723e8c 100644 --- a/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/XContentParserTests.java +++ b/libs/x-content/src/test/java/org/elasticsearch/common/xcontent/XContentParserTests.java @@ -78,14 +78,17 @@ public void testFloat() throws IOException { assertEquals(value, number.floatValue(), 0.0f); - if (xContentType == XContentType.CBOR) { - // CBOR parses back a float - assertTrue(number instanceof Float); - } else { - // JSON, YAML and SMILE parses back the float value as a double - // This will change for SMILE in Jackson 2.9 where all binary based - // formats will return a float - assertTrue(number instanceof Double); + switch (xContentType) { + case CBOR: + case SMILE: + assertThat(number, instanceOf(Float.class)); + break; + case JSON: + case YAML: + assertThat(number, instanceOf(Double.class)); + break; + default: + throw new AssertionError("unexpected x-content type [" + xContentType + "]"); } } } diff --git a/modules/aggs-matrix-stats/build.gradle b/modules/aggs-matrix-stats/build.gradle index b397edf2cbb5c..ffe8145eba341 100644 --- a/modules/aggs-matrix-stats/build.gradle +++ b/modules/aggs-matrix-stats/build.gradle @@ -21,3 +21,9 @@ esplugin { description 'Adds aggregations whose input are a list of numeric fields and output includes a matrix.' 
classname 'org.elasticsearch.search.aggregations.matrix.MatrixAggregationPlugin' } + +restResources { + restApi { + includeCore '_common', 'indices', 'cluster', 'index', 'search', 'nodes' + } +} diff --git a/modules/analysis-common/build.gradle b/modules/analysis-common/build.gradle index 773c645f698cc..24e6e651c3fd9 100644 --- a/modules/analysis-common/build.gradle +++ b/modules/analysis-common/build.gradle @@ -23,6 +23,12 @@ esplugin { extendedPlugins = ['lang-painless'] } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'cluster', 'search', 'nodes', 'bulk', 'termvectors', 'explain', 'count' + } +} + dependencies { compileOnly project(':modules:lang-painless') } diff --git a/modules/ingest-common/build.gradle b/modules/ingest-common/build.gradle index 6590bdc1c52ef..2ff50f1543ed9 100644 --- a/modules/ingest-common/build.gradle +++ b/modules/ingest-common/build.gradle @@ -29,6 +29,12 @@ dependencies { compile project(':libs:elasticsearch-dissect') } +restResources { + restApi { + includeCore '_common', 'ingest', 'cluster', 'indices', 'index', 'bulk', 'nodes', 'get', 'update', 'cat', 'mget' + } +} + testClusters.integTest { // Needed in order to test ingest pipeline templating: // (this is because the integTest node is not using default distribution, but only the minimal number of required modules) diff --git a/modules/ingest-common/src/main/java/org/elasticsearch/ingest/common/AbstractStringProcessor.java b/modules/ingest-common/src/main/java/org/elasticsearch/ingest/common/AbstractStringProcessor.java index 546519aa5f606..ded75e95be73d 100644 --- a/modules/ingest-common/src/main/java/org/elasticsearch/ingest/common/AbstractStringProcessor.java +++ b/modules/ingest-common/src/main/java/org/elasticsearch/ingest/common/AbstractStringProcessor.java @@ -24,6 +24,8 @@ import org.elasticsearch.ingest.IngestDocument; import org.elasticsearch.ingest.Processor; +import java.util.ArrayList; +import java.util.List; import java.util.Map; /** @@ -58,7 +60,8 @@ String getTargetField() { @Override public final IngestDocument execute(IngestDocument document) { - String val = document.getFieldValue(field, String.class, ignoreMissing); + Object val = document.getFieldValue(field, Object.class, ignoreMissing); + Object newValue; if (val == null && ignoreMissing) { return document; @@ -66,7 +69,29 @@ public final IngestDocument execute(IngestDocument document) { throw new IllegalArgumentException("field [" + field + "] is null, cannot process it."); } - document.setFieldValue(targetField, process(val)); + if (val instanceof List) { + List list = (List) val; + List newList = new ArrayList<>(list.size()); + for (Object value : list) { + if (value instanceof String) { + newList.add(process((String) value)); + } else { + throw new IllegalArgumentException("value [" + value + "] of type [" + value.getClass().getName() + + "] in list field [" + field + "] cannot be cast to [" + String.class.getName() + "]"); + } + } + newValue = newList; + } else { + if (val instanceof String) { + newValue = process((String) val); + } else { + throw new IllegalArgumentException("field [" + field + "] of type [" + val.getClass().getName() + "] cannot be cast to [" + + String.class.getName() + "]"); + } + + } + + document.setFieldValue(targetField, newValue); return document; } diff --git a/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/AbstractStringProcessorTestCase.java b/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/AbstractStringProcessorTestCase.java index 
f667f84e5d7b1..b9c83be40fff7 100644 --- a/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/AbstractStringProcessorTestCase.java +++ b/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/AbstractStringProcessorTestCase.java @@ -24,8 +24,10 @@ import org.elasticsearch.ingest.RandomDocumentPicks; import org.elasticsearch.test.ESTestCase; +import java.util.ArrayList; import java.util.Collections; import java.util.HashMap; +import java.util.List; import static org.elasticsearch.ingest.IngestDocumentMatcher.assertIngestDocument; import static org.hamcrest.Matchers.containsString; @@ -41,7 +43,7 @@ protected String modifyInput(String input) { protected abstract T expectedResult(String input); - protected Class expectedResultType(){ + protected Class expectedResultType() { return String.class; // most results types are Strings } @@ -52,6 +54,19 @@ public void testProcessor() throws Exception { Processor processor = newProcessor(fieldName, randomBoolean(), fieldName); processor.execute(ingestDocument); assertThat(ingestDocument.getFieldValue(fieldName, expectedResultType()), equalTo(expectedResult(fieldValue))); + + int numItems = randomIntBetween(1, 10); + List fieldValueList = new ArrayList<>(); + List expectedList = new ArrayList<>(); + for (int i = 0; i < numItems; i++) { + String randomString = RandomDocumentPicks.randomString(random()); + fieldValueList.add(modifyInput(randomString)); + expectedList.add(expectedResult(randomString)); + } + String multiValueFieldName = RandomDocumentPicks.addRandomField(random(), ingestDocument, fieldValueList); + Processor multiValueProcessor = newProcessor(multiValueFieldName, randomBoolean(), multiValueFieldName); + multiValueProcessor.execute(ingestDocument); + assertThat(ingestDocument.getFieldValue(multiValueFieldName, List.class), equalTo(expectedList)); } public void testFieldNotFound() throws Exception { @@ -94,6 +109,14 @@ public void testNonStringValue() throws Exception { Exception e = expectThrows(Exception.class, () -> processor.execute(ingestDocument)); assertThat(e.getMessage(), equalTo("field [" + fieldName + "] of type [java.lang.Integer] cannot be cast to [java.lang.String]")); + + List fieldValueList = new ArrayList<>(); + int randomValue = randomInt(); + fieldValueList.add(randomValue); + ingestDocument.setFieldValue(fieldName, fieldValueList); + Exception exception = expectThrows(Exception.class, () -> processor.execute(ingestDocument)); + assertThat(exception.getMessage(), equalTo("value [" + randomValue + "] of type [java.lang.Integer] in list field [" + fieldName + + "] cannot be cast to [java.lang.String]")); } public void testNonStringValueWithIgnoreMissing() throws Exception { @@ -104,6 +127,14 @@ public void testNonStringValueWithIgnoreMissing() throws Exception { Exception e = expectThrows(Exception.class, () -> processor.execute(ingestDocument)); assertThat(e.getMessage(), equalTo("field [" + fieldName + "] of type [java.lang.Integer] cannot be cast to [java.lang.String]")); + + List fieldValueList = new ArrayList<>(); + int randomValue = randomInt(); + fieldValueList.add(randomValue); + ingestDocument.setFieldValue(fieldName, fieldValueList); + Exception exception = expectThrows(Exception.class, () -> processor.execute(ingestDocument)); + assertThat(exception.getMessage(), equalTo("value [" + randomValue + "] of type [java.lang.Integer] in list field [" + fieldName + + "] cannot be cast to [java.lang.String]")); } public void testTargetField() throws Exception { diff --git 
a/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/JsonProcessorTests.java b/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/JsonProcessorTests.java index 099e8e1866b8e..c9e81ccf4c811 100644 --- a/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/JsonProcessorTests.java +++ b/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/JsonProcessorTests.java @@ -65,7 +65,7 @@ public void testInvalidValue() { Exception exception = expectThrows(IllegalArgumentException.class, () -> jsonProcessor.execute(ingestDocument)); assertThat(exception.getCause().getMessage(), containsString("Unrecognized token 'blah': " + - "was expecting ('true', 'false' or 'null')")); + "was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')")); } public void testByteArray() { @@ -75,7 +75,12 @@ public void testByteArray() { IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random(), document); Exception exception = expectThrows(IllegalArgumentException.class, () -> jsonProcessor.execute(ingestDocument)); - assertThat(exception.getCause().getMessage(), containsString("Unrecognized token 'B': was expecting ('true', 'false' or 'null')")); + assertThat( + exception.getCause().getMessage(), + containsString( + "Unrecognized token 'B': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')" + ) + ); } public void testNull() throws Exception { diff --git a/modules/ingest-geoip/build.gradle b/modules/ingest-geoip/build.gradle index 6789f96d883b8..5d200bf87b55a 100644 --- a/modules/ingest-geoip/build.gradle +++ b/modules/ingest-geoip/build.gradle @@ -25,17 +25,21 @@ esplugin { } dependencies { - // Upgrade to 2.10.0 or higher when jackson-core gets upgraded to 2.9.x. 
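The JsonProcessorTests changes above only track Jackson's reworded parse error between the 2.8 and 2.10 lines. A quick way to reproduce the new wording locally, assuming a Jackson 2.10.x jar on the classpath:

    import com.fasterxml.jackson.databind.ObjectMapper;

    final class JacksonMessageProbe {
        public static void main(String[] args) {
            try {
                new ObjectMapper().readTree("blah"); // deliberately invalid JSON
            } catch (Exception e) {
                // On the 2.10 line the message contains:
                // Unrecognized token 'blah': was expecting (JSON String, Number,
                // Array, Object or token 'null', 'true' or 'false')
                System.out.println(e.getMessage());
            }
        }
    }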
Blocked by #27032 - compile('com.maxmind.geoip2:geoip2:2.9.0') + compile('com.maxmind.geoip2:geoip2:2.13.1') // geoip2 dependencies: - // do not hardcode this to the version in version.properties, it needs to be upgraded separately with geoip2 - compile("com.fasterxml.jackson.core:jackson-annotations:2.8.11") - compile("com.fasterxml.jackson.core:jackson-databind:2.8.11.6") - compile('com.maxmind.db:maxmind-db:1.2.2') + compile("com.fasterxml.jackson.core:jackson-annotations:${versions.jackson}") + compile("com.fasterxml.jackson.core:jackson-databind:${versions.jackson}") + compile('com.maxmind.db:maxmind-db:1.3.1') testCompile 'org.elasticsearch:geolite2-databases:20191119' } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'cluster', 'nodes', 'get', 'ingest' + } +} + task copyDefaultGeoIp2DatabaseFiles(type: Copy) { from { zipTree(configurations.testCompile.files.find { it.name.contains('geolite2-databases') }) } into "${project.buildDir}/ingest-geoip" diff --git a/modules/ingest-geoip/licenses/geoip2-2.13.1.jar.sha1 b/modules/ingest-geoip/licenses/geoip2-2.13.1.jar.sha1 new file mode 100644 index 0000000000000..253d9f12e7a3a --- /dev/null +++ b/modules/ingest-geoip/licenses/geoip2-2.13.1.jar.sha1 @@ -0,0 +1 @@ +f27d1a49d5a29dd4a7ac5006ce2eb16b8b9bb888 \ No newline at end of file diff --git a/modules/ingest-geoip/licenses/geoip2-2.9.0.jar.sha1 b/modules/ingest-geoip/licenses/geoip2-2.9.0.jar.sha1 deleted file mode 100644 index 8cb79bbb9207a..0000000000000 --- a/modules/ingest-geoip/licenses/geoip2-2.9.0.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -c12b463a2c10824225c0b27952c49b464cb0e4c6 \ No newline at end of file diff --git a/modules/ingest-geoip/licenses/jackson-annotations-2.10.3.jar.sha1 b/modules/ingest-geoip/licenses/jackson-annotations-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..9c725f2d90e69 --- /dev/null +++ b/modules/ingest-geoip/licenses/jackson-annotations-2.10.3.jar.sha1 @@ -0,0 +1 @@ +0f63b3b1da563767d04d2e4d3fc1ae0cdeffebe7 \ No newline at end of file diff --git a/modules/ingest-geoip/licenses/jackson-annotations-2.8.11.jar.sha1 b/modules/ingest-geoip/licenses/jackson-annotations-2.8.11.jar.sha1 deleted file mode 100644 index 30e7d1a7b1a74..0000000000000 --- a/modules/ingest-geoip/licenses/jackson-annotations-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -391de20b4e29cb3fb07d2454ace64be2c82ac91f \ No newline at end of file diff --git a/modules/ingest-geoip/licenses/jackson-databind-2.10.3.jar.sha1 b/modules/ingest-geoip/licenses/jackson-databind-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..688ae92d10792 --- /dev/null +++ b/modules/ingest-geoip/licenses/jackson-databind-2.10.3.jar.sha1 @@ -0,0 +1 @@ +aae92628b5447fa25af79871ca98668da6edd439 \ No newline at end of file diff --git a/modules/ingest-geoip/licenses/jackson-databind-2.8.11.6.jar.sha1 b/modules/ingest-geoip/licenses/jackson-databind-2.8.11.6.jar.sha1 deleted file mode 100644 index f491259db56bc..0000000000000 --- a/modules/ingest-geoip/licenses/jackson-databind-2.8.11.6.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -35753201d0cdb1dbe998ab289bca1180b68d4368 \ No newline at end of file diff --git a/modules/ingest-geoip/licenses/maxmind-db-1.2.2.jar.sha1 b/modules/ingest-geoip/licenses/maxmind-db-1.2.2.jar.sha1 deleted file mode 100644 index 3dd3ad36f4cc3..0000000000000 --- a/modules/ingest-geoip/licenses/maxmind-db-1.2.2.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -78c22a03de1e222b0751855aff7bb6e6db5569e5 \ No newline at end of file diff --git 
a/modules/ingest-geoip/licenses/maxmind-db-1.3.1.jar.sha1 b/modules/ingest-geoip/licenses/maxmind-db-1.3.1.jar.sha1 new file mode 100644 index 0000000000000..aebff2c3a849c --- /dev/null +++ b/modules/ingest-geoip/licenses/maxmind-db-1.3.1.jar.sha1 @@ -0,0 +1 @@ +211bca628225bc0f719051b16deb03a747d7a14f \ No newline at end of file diff --git a/modules/ingest-user-agent/build.gradle b/modules/ingest-user-agent/build.gradle index b3e17d58360d4..d45b30c9f1daf 100644 --- a/modules/ingest-user-agent/build.gradle +++ b/modules/ingest-user-agent/build.gradle @@ -22,6 +22,12 @@ esplugin { classname 'org.elasticsearch.ingest.useragent.IngestUserAgentPlugin' } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'cluster', 'nodes', 'get', 'ingest' + } +} + testClusters.integTest { extraConfigFile 'ingest-user-agent/test-regexes.yml', file('src/test/test-regexes.yml') } diff --git a/modules/kibana/src/main/java/org/elasticsearch/kibana/KibanaPlugin.java b/modules/kibana/src/main/java/org/elasticsearch/kibana/KibanaPlugin.java deleted file mode 100644 index f917c477493cc..0000000000000 --- a/modules/kibana/src/main/java/org/elasticsearch/kibana/KibanaPlugin.java +++ /dev/null @@ -1,145 +0,0 @@ -/* - * Licensed to Elasticsearch under one or more contributor - * license agreements. See the NOTICE file distributed with - * this work for additional information regarding copyright - * ownership. Elasticsearch licenses this file to you under - * the Apache License, Version 2.0 (the "License"); you may - * not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, - * software distributed under the License is distributed on an - * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY - * KIND, either express or implied. See the License for the - * specific language governing permissions and limitations - * under the License. 
- */ - -package org.elasticsearch.kibana; - -import org.elasticsearch.client.node.NodeClient; -import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; -import org.elasticsearch.cluster.node.DiscoveryNodes; -import org.elasticsearch.common.settings.ClusterSettings; -import org.elasticsearch.common.settings.IndexScopedSettings; -import org.elasticsearch.common.settings.Setting; -import org.elasticsearch.common.settings.Setting.Property; -import org.elasticsearch.common.settings.Settings; -import org.elasticsearch.common.settings.SettingsFilter; -import org.elasticsearch.index.reindex.RestDeleteByQueryAction; -import org.elasticsearch.indices.SystemIndexDescriptor; -import org.elasticsearch.plugins.Plugin; -import org.elasticsearch.plugins.SystemIndexPlugin; -import org.elasticsearch.rest.BaseRestHandler; -import org.elasticsearch.rest.RestController; -import org.elasticsearch.rest.RestHandler; -import org.elasticsearch.rest.RestRequest; -import org.elasticsearch.rest.action.admin.indices.RestCreateIndexAction; -import org.elasticsearch.rest.action.admin.indices.RestGetAliasesAction; -import org.elasticsearch.rest.action.admin.indices.RestGetIndicesAction; -import org.elasticsearch.rest.action.admin.indices.RestIndexPutAliasAction; -import org.elasticsearch.rest.action.admin.indices.RestRefreshAction; -import org.elasticsearch.rest.action.admin.indices.RestUpdateSettingsAction; -import org.elasticsearch.rest.action.document.RestBulkAction; -import org.elasticsearch.rest.action.document.RestDeleteAction; -import org.elasticsearch.rest.action.document.RestGetAction; -import org.elasticsearch.rest.action.document.RestIndexAction; -import org.elasticsearch.rest.action.document.RestIndexAction.AutoIdHandler; -import org.elasticsearch.rest.action.document.RestIndexAction.CreateHandler; -import org.elasticsearch.rest.action.document.RestMultiGetAction; -import org.elasticsearch.rest.action.document.RestUpdateAction; -import org.elasticsearch.rest.action.search.RestClearScrollAction; -import org.elasticsearch.rest.action.search.RestSearchAction; -import org.elasticsearch.rest.action.search.RestSearchScrollAction; - -import java.io.IOException; -import java.util.Collection; -import java.util.List; -import java.util.function.Function; -import java.util.function.Supplier; -import java.util.stream.Collectors; - -public class KibanaPlugin extends Plugin implements SystemIndexPlugin { - - public static final Setting> KIBANA_INDEX_NAMES_SETTING = Setting.listSetting("kibana.system_indices", - List.of(".kibana*", ".reporting"), Function.identity(), Property.NodeScope); - - @Override - public Collection getSystemIndexDescriptors(Settings settings) { - return KIBANA_INDEX_NAMES_SETTING.get(settings).stream() - .map(pattern -> new SystemIndexDescriptor(pattern, "System index used by kibana")) - .collect(Collectors.toUnmodifiableList()); - } - - @Override - public List getRestHandlers(Settings settings, RestController restController, ClusterSettings clusterSettings, - IndexScopedSettings indexScopedSettings, SettingsFilter settingsFilter, - IndexNameExpressionResolver indexNameExpressionResolver, - Supplier nodesInCluster) { - // TODO need to figure out what subset of system indices Kibana should have access to via these APIs - final List allowedIndexPatterns = List.of(); - return List.of( - // Based on https://github.com/elastic/kibana/issues/49764 - // apis needed to perform migrations... 
ideally these will go away - new KibanaWrappedRestHandler(new RestCreateIndexAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestGetAliasesAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestIndexPutAliasAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestRefreshAction(), allowedIndexPatterns), - - // apis needed to access saved objects - new KibanaWrappedRestHandler(new RestGetAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestMultiGetAction(settings), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestSearchAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestBulkAction(settings), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestDeleteAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestDeleteByQueryAction(), allowedIndexPatterns), - - // api used for testing - new KibanaWrappedRestHandler(new RestUpdateSettingsAction(), allowedIndexPatterns), - - // apis used specifically by reporting - new KibanaWrappedRestHandler(new RestGetIndicesAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestIndexAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new CreateHandler(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new AutoIdHandler(nodesInCluster), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestUpdateAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestSearchScrollAction(), allowedIndexPatterns), - new KibanaWrappedRestHandler(new RestClearScrollAction(), allowedIndexPatterns) - ); - - } - - @Override - public List> getSettings() { - return List.of(KIBANA_INDEX_NAMES_SETTING); - } - - static class KibanaWrappedRestHandler extends BaseRestHandler.Wrapper { - - private final List allowedIndexPatterns; - - KibanaWrappedRestHandler(BaseRestHandler delegate, List allowedIndexPatterns) { - super(delegate); - this.allowedIndexPatterns = allowedIndexPatterns; - } - - @Override - public String getName() { - return "kibana_" + super.getName(); - } - - @Override - public List routes() { - return super.routes().stream().map(route -> new Route(route.getMethod(), "/_kibana" + route.getPath())) - .collect(Collectors.toUnmodifiableList()); - } - - @Override - protected RestChannelConsumer prepareRequest(RestRequest request, NodeClient client) throws IOException { - client.threadPool().getThreadContext().allowSystemIndexAccess(allowedIndexPatterns); - return super.prepareRequest(request, client); - } - } -} diff --git a/modules/kibana/src/test/java/org/elasticsearch/kibana/KibanaPluginTests.java b/modules/kibana/src/test/java/org/elasticsearch/kibana/KibanaPluginTests.java deleted file mode 100644 index 5094dd7178bcb..0000000000000 --- a/modules/kibana/src/test/java/org/elasticsearch/kibana/KibanaPluginTests.java +++ /dev/null @@ -1,46 +0,0 @@ - -/* - * Licensed to Elasticsearch under one or more contributor - * license agreements. See the NOTICE file distributed with - * this work for additional information regarding copyright - * ownership. Elasticsearch licenses this file to you under - * the Apache License, Version 2.0 (the "License"); you may - * not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, - * software distributed under the License is distributed on an - * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY - * KIND, either express or implied. See the License for the - * specific language governing permissions and limitations - * under the License. - */ - -package org.elasticsearch.kibana; - -import org.elasticsearch.common.settings.Settings; -import org.elasticsearch.indices.SystemIndexDescriptor; -import org.elasticsearch.test.ESTestCase; - -import java.util.List; -import java.util.stream.Collectors; - -import static org.hamcrest.Matchers.contains; -import static org.hamcrest.Matchers.is; - -public class KibanaPluginTests extends ESTestCase { - - public void testKibanaIndexNames() { - assertThat(new KibanaPlugin().getSettings(), contains(KibanaPlugin.KIBANA_INDEX_NAMES_SETTING)); - assertThat(new KibanaPlugin().getSystemIndexDescriptors(Settings.EMPTY).stream() - .map(SystemIndexDescriptor::getIndexPattern).collect(Collectors.toUnmodifiableList()), - contains(".kibana*", ".reporting")); - final List names = List.of("." + randomAlphaOfLength(4), "." + randomAlphaOfLength(6)); - final List namesFromDescriptors = new KibanaPlugin().getSystemIndexDescriptors( - Settings.builder().putList(KibanaPlugin.KIBANA_INDEX_NAMES_SETTING.getKey(), names).build() - ).stream().map(SystemIndexDescriptor::getIndexPattern).collect(Collectors.toUnmodifiableList()); - assertThat(namesFromDescriptors, is(names)); - } -} diff --git a/modules/kibana/src/test/java/org/elasticsearch/kibana/KibanaSystemIndexIT.java b/modules/kibana/src/test/java/org/elasticsearch/kibana/KibanaSystemIndexIT.java deleted file mode 100644 index f3901112e839f..0000000000000 --- a/modules/kibana/src/test/java/org/elasticsearch/kibana/KibanaSystemIndexIT.java +++ /dev/null @@ -1,249 +0,0 @@ -/* - * Licensed to Elasticsearch under one or more contributor - * license agreements. See the NOTICE file distributed with - * this work for additional information regarding copyright - * ownership. Elasticsearch licenses this file to you under - * the Apache License, Version 2.0 (the "License"); you may - * not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, - * software distributed under the License is distributed on an - * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY - * KIND, either express or implied. See the License for the - * specific language governing permissions and limitations - * under the License. 
- */ - -package org.elasticsearch.kibana; - -import org.apache.http.util.EntityUtils; -import org.elasticsearch.client.Request; -import org.elasticsearch.client.Response; -import org.elasticsearch.common.xcontent.XContentHelper; -import org.elasticsearch.common.xcontent.json.JsonXContent; -import org.elasticsearch.test.rest.ESRestTestCase; - -import java.io.IOException; -import java.util.Map; - -import static org.hamcrest.Matchers.containsString; -import static org.hamcrest.Matchers.is; - -public class KibanaSystemIndexIT extends ESRestTestCase { - - public void testCreateIndex() throws IOException { - Request request = new Request("PUT", "/_kibana/.kibana-1"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - } - - public void testAliases() throws IOException { - Request request = new Request("PUT", "/_kibana/.kibana-1"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - request = new Request("PUT", "/_kibana/.kibana-1/_alias/.kibana"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - request = new Request("GET", "/_kibana/_aliases"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - assertThat(EntityUtils.toString(response.getEntity()), containsString(".kibana")); - } - - public void testBulkToKibanaIndex() throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - } - - public void testRefresh() throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - request = new Request("GET", "/_kibana/.kibana/_refresh"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - Request getRequest = new Request("GET", "/_kibana/.kibana/_doc/1"); - Response getResponse = client().performRequest(getRequest); - assertThat(getResponse.getStatusLine().getStatusCode(), is(200)); - String responseBody = EntityUtils.toString(getResponse.getEntity()); - assertThat(responseBody, containsString("foo")); - assertThat(responseBody, containsString("bar")); - } - - public void testGetFromKibanaIndex() throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n"); - request.addParameter("refresh", "true"); - - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - Request getRequest = new Request("GET", "/_kibana/.kibana/_doc/1"); - Response getResponse = client().performRequest(getRequest); - assertThat(getResponse.getStatusLine().getStatusCode(), is(200)); - String responseBody = EntityUtils.toString(getResponse.getEntity()); - assertThat(responseBody, containsString("foo")); - assertThat(responseBody, containsString("bar")); - } - - public void testMultiGetFromKibanaIndex() 
throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n" + - "{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"2\" } }\n{ \"baz\" : \"tag\" }\n"); - request.addParameter("refresh", "true"); - - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - Request getRequest = new Request("GET", "/_kibana/_mget"); - getRequest.setJsonEntity("{ \"docs\" : [ { \"_index\" : \".kibana\", \"_id\" : \"1\" }, " + - "{ \"_index\" : \".kibana\", \"_id\" : \"2\" } ] }\n"); - Response getResponse = client().performRequest(getRequest); - assertThat(getResponse.getStatusLine().getStatusCode(), is(200)); - String responseBody = EntityUtils.toString(getResponse.getEntity()); - assertThat(responseBody, containsString("foo")); - assertThat(responseBody, containsString("bar")); - assertThat(responseBody, containsString("baz")); - assertThat(responseBody, containsString("tag")); - } - - public void testSearchFromKibanaIndex() throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n" + - "{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"2\" } }\n{ \"baz\" : \"tag\" }\n"); - request.addParameter("refresh", "true"); - - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - Request searchRequest = new Request("GET", "/_kibana/.kibana/_search"); - searchRequest.setJsonEntity("{ \"query\" : { \"match_all\" : {} } }\n"); - Response getResponse = client().performRequest(searchRequest); - assertThat(getResponse.getStatusLine().getStatusCode(), is(200)); - String responseBody = EntityUtils.toString(getResponse.getEntity()); - assertThat(responseBody, containsString("foo")); - assertThat(responseBody, containsString("bar")); - assertThat(responseBody, containsString("baz")); - assertThat(responseBody, containsString("tag")); - } - - public void testDeleteFromKibanaIndex() throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n" + - "{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"2\" } }\n{ \"baz\" : \"tag\" }\n"); - request.addParameter("refresh", "true"); - - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - Request deleteRequest = new Request("DELETE", "/_kibana/.kibana/_doc/1"); - Response deleteResponse = client().performRequest(deleteRequest); - assertThat(deleteResponse.getStatusLine().getStatusCode(), is(200)); - } - - public void testDeleteByQueryFromKibanaIndex() throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n" + - "{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"2\" } }\n{ \"baz\" : \"tag\" }\n"); - request.addParameter("refresh", "true"); - - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - Request dbqRequest = new Request("POST", "/_kibana/.kibana/_delete_by_query"); - dbqRequest.setJsonEntity("{ \"query\" : { \"match_all\" : {} } }\n"); - Response dbqResponse = 
client().performRequest(dbqRequest); - assertThat(dbqResponse.getStatusLine().getStatusCode(), is(200)); - } - - public void testUpdateIndexSettings() throws IOException { - Request request = new Request("PUT", "/_kibana/.kibana-1"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - request = new Request("PUT", "/_kibana/.kibana-1/_settings"); - request.setJsonEntity("{ \"index.blocks.read_only\" : false }"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - } - - public void testGetIndex() throws IOException { - Request request = new Request("PUT", "/_kibana/.kibana-1"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - request = new Request("GET", "/_kibana/.kibana-1"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - assertThat(EntityUtils.toString(response.getEntity()), containsString(".kibana-1")); - } - - public void testIndexingAndUpdatingDocs() throws IOException { - Request request = new Request("PUT", "/_kibana/.kibana-1/_doc/1"); - request.setJsonEntity("{ \"foo\" : \"bar\" }"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(201)); - - request = new Request("PUT", "/_kibana/.kibana-1/_create/2"); - request.setJsonEntity("{ \"foo\" : \"bar\" }"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(201)); - - request = new Request("POST", "/_kibana/.kibana-1/_doc"); - request.setJsonEntity("{ \"foo\" : \"bar\" }"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(201)); - - request = new Request("GET", "/_kibana/.kibana-1/_refresh"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - request = new Request("POST", "/_kibana/.kibana-1/_update/1"); - request.setJsonEntity("{ \"doc\" : { \"foo\" : \"baz\" } }"); - response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - } - - public void testScrollingDocs() throws IOException { - Request request = new Request("POST", "/_kibana/_bulk"); - request.setJsonEntity("{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"1\" } }\n{ \"foo\" : \"bar\" }\n" + - "{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"2\" } }\n{ \"baz\" : \"tag\" }\n" + - "{ \"index\" : { \"_index\" : \".kibana\", \"_id\" : \"3\" } }\n{ \"baz\" : \"tag\" }\n"); - request.addParameter("refresh", "true"); - Response response = client().performRequest(request); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - - Request searchRequest = new Request("GET", "/_kibana/.kibana/_search"); - searchRequest.setJsonEntity("{ \"size\" : 1,\n\"query\" : { \"match_all\" : {} } }\n"); - searchRequest.addParameter("scroll", "1m"); - response = client().performRequest(searchRequest); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - Map map = XContentHelper.convertToMap(JsonXContent.jsonXContent, EntityUtils.toString(response.getEntity()), false); - assertNotNull(map.get("_scroll_id")); - String scrollId = (String) map.get("_scroll_id"); - - Request scrollRequest = new Request("POST", "/_kibana/_search/scroll"); - scrollRequest.addParameter("scroll_id", scrollId); - 
scrollRequest.addParameter("scroll", "1m"); - response = client().performRequest(scrollRequest); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - map = XContentHelper.convertToMap(JsonXContent.jsonXContent, EntityUtils.toString(response.getEntity()), false); - assertNotNull(map.get("_scroll_id")); - scrollId = (String) map.get("_scroll_id"); - - Request clearScrollRequest = new Request("DELETE", "/_kibana/_search/scroll"); - clearScrollRequest.addParameter("scroll_id", scrollId); - response = client().performRequest(clearScrollRequest); - assertThat(response.getStatusLine().getStatusCode(), is(200)); - } -} diff --git a/modules/lang-expression/build.gradle b/modules/lang-expression/build.gradle index 2fd6e53effa34..d3a165c13b947 100644 --- a/modules/lang-expression/build.gradle +++ b/modules/lang-expression/build.gradle @@ -29,6 +29,11 @@ dependencies { compile 'org.ow2.asm:asm-commons:5.0.4' compile 'org.ow2.asm:asm-tree:5.0.4' } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'cluster', 'nodes', 'search' + } +} dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' diff --git a/modules/lang-expression/licenses/lucene-expressions-8.5.0-snapshot-7f057455901.jar.sha1 b/modules/lang-expression/licenses/lucene-expressions-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 19c383c96f0a0..0000000000000 --- a/modules/lang-expression/licenses/lucene-expressions-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -1219c9aca51a37ea3e22cf88ad2e8745d1a6e02f \ No newline at end of file diff --git a/modules/lang-expression/licenses/lucene-expressions-8.5.0.jar.sha1 b/modules/lang-expression/licenses/lucene-expressions-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..ea1af2536868e --- /dev/null +++ b/modules/lang-expression/licenses/lucene-expressions-8.5.0.jar.sha1 @@ -0,0 +1 @@ +41fcbae8104c54487c83c002cf3da6a13065b7a4 \ No newline at end of file diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/CountMethodValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/CountMethodValueSource.java index d70c6790317f9..7d15d9a9ba83f 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/CountMethodValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/CountMethodValueSource.java @@ -19,44 +19,36 @@ package org.elasticsearch.script.expression; -import java.io.IOException; -import java.util.Map; -import java.util.Objects; - import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; -import org.apache.lucene.queries.function.ValueSource; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.LeafNumericFieldData; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.index.fielddata.SortedNumericDoubleValues; +import java.io.IOException; + /** * A ValueSource to create FunctionValues to get the count of the number of values in a field for a document. 
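Context for the lang-expression hunks that follow: the PR migrates from Lucene's ValueSource/FunctionValues, whose doubleVal(int doc) is random-access, to DoubleValuesSource/DoubleValues, a forward iterator where advanceExact(doc) must be called first and doubleValue() is only valid when it returned true. A minimal sketch of the consumer-side contract (the helper name is illustrative):

    import java.io.IOException;
    import org.apache.lucene.search.DoubleValues;

    final class DoubleValuesContract {
        // Reads one document's value, falling back to `missing` when the
        // iterator reports no value for that document.
        static double valueOrDefault(DoubleValues values, int doc, double missing) throws IOException {
            return values.advanceExact(doc) ? values.doubleValue() : missing;
        }
    }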
*/ -final class CountMethodValueSource extends ValueSource { - IndexFieldData fieldData; +final class CountMethodValueSource extends FieldDataBasedDoubleValuesSource { CountMethodValueSource(IndexFieldData fieldData) { - Objects.requireNonNull(fieldData); - - this.fieldData = fieldData; + super(fieldData); } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { - LeafNumericFieldData leafData = (LeafNumericFieldData) fieldData.load(leaf); + public DoubleValues getValues(LeafReaderContext ctx, DoubleValues scores) { + LeafNumericFieldData leafData = (LeafNumericFieldData) fieldData.load(ctx); final SortedNumericDoubleValues values = leafData.getDoubleValues(); + return new DoubleValues() { + @Override + public double doubleValue() { + return values.docValueCount(); + } - return new DoubleDocValues(this) { @Override - public double doubleVal(int doc) throws IOException { - if (values.advanceExact(doc)) { - return values.docValueCount(); - } else { - return 0; - } + public boolean advanceExact(int doc) throws IOException { + return values.advanceExact(doc); } }; } @@ -65,19 +57,18 @@ public double doubleVal(int doc) throws IOException { public boolean equals(Object o) { if (this == o) return true; if (o == null || getClass() != o.getClass()) return false; - - FieldDataValueSource that = (FieldDataValueSource) o; - + CountMethodValueSource that = (CountMethodValueSource) o; return fieldData.equals(that.fieldData); } @Override - public int hashCode() { - return 31 * getClass().hashCode() + fieldData.hashCode(); + public String toString() { + return "count: field(" + fieldData.getFieldName() + ")"; } @Override - public String description() { - return "count: field(" + fieldData.getFieldName() + ")"; + public int hashCode() { + return 31 * getClass().hashCode() + fieldData.hashCode(); } + } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateField.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateField.java index 2d50f160812dc..3ce572cc43710 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateField.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateField.java @@ -19,12 +19,12 @@ * under the License. */ -import java.util.Calendar; - -import org.apache.lucene.queries.function.ValueSource; +import org.apache.lucene.search.DoubleValuesSource; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.search.MultiValueMode; +import java.util.Calendar; + /** * Expressions API for date fields. 
*/ @@ -56,7 +56,7 @@ private DateField() {} static final String GET_MINUTES_METHOD = "getMinutes"; static final String GET_SECONDS_METHOD = "getSeconds"; - static ValueSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { + static DoubleValuesSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { switch (variable) { case VALUE_VARIABLE: return new FieldDataValueSource(fieldData, MultiValueMode.MIN); @@ -69,7 +69,7 @@ static ValueSource getVariable(IndexFieldData fieldData, String fieldName, St } } - static ValueSource getMethod(IndexFieldData fieldData, String fieldName, String method) { + static DoubleValuesSource getMethod(IndexFieldData fieldData, String fieldName, String method) { switch (method) { case GETVALUE_METHOD: return new FieldDataValueSource(fieldData, MultiValueMode.MIN); diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateMethodValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateMethodValueSource.java index ee8c604a39ba2..f1c918e2d2447 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateMethodValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateMethodValueSource.java @@ -19,21 +19,19 @@ package org.elasticsearch.script.expression; -import java.io.IOException; -import java.util.Calendar; -import java.util.Locale; -import java.util.Map; -import java.util.Objects; -import java.util.TimeZone; - import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.LeafNumericFieldData; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.index.fielddata.NumericDoubleValues; import org.elasticsearch.search.MultiValueMode; +import java.io.IOException; +import java.util.Calendar; +import java.util.Locale; +import java.util.Objects; +import java.util.TimeZone; + /** Extracts a portion of a date field with {@code Calendar.get()} */ class DateMethodValueSource extends FieldDataValueSource { @@ -50,27 +48,26 @@ class DateMethodValueSource extends FieldDataValueSource { } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { + public DoubleValues getValues(LeafReaderContext leaf, DoubleValues scores) { LeafNumericFieldData leafData = (LeafNumericFieldData) fieldData.load(leaf); final Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("UTC"), Locale.ROOT); NumericDoubleValues docValues = multiValueMode.select(leafData.getDoubleValues()); - return new DoubleDocValues(this) { + return new DoubleValues() { + @Override + public double doubleValue() throws IOException { + calendar.setTimeInMillis((long)docValues.doubleValue()); + return calendar.get(calendarType); + } + @Override - public double doubleVal(int docId) throws IOException { - if (docValues.advanceExact(docId)) { - long millis = (long)docValues.doubleValue(); - calendar.setTimeInMillis(millis); - return calendar.get(calendarType); - } else { - return 0; - } + public boolean advanceExact(int doc) throws IOException { + return docValues.advanceExact(doc); } }; } @Override - public String description() { + public String toString() { return methodName + ": field(" + 
fieldData.getFieldName() + ")"; } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObject.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObject.java index 66cc56370a32d..e5d12bcff1152 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObject.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObject.java @@ -19,7 +19,7 @@ * under the License. */ -import org.apache.lucene.queries.function.ValueSource; +import org.apache.lucene.search.DoubleValuesSource; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.search.MultiValueMode; import org.joda.time.ReadableDateTime; @@ -30,7 +30,7 @@ final class DateObject { // no instance private DateObject() {} - + // supported variables static final String CENTURY_OF_ERA_VARIABLE = "centuryOfEra"; static final String DAY_OF_MONTH_VARIABLE = "dayOfMonth"; @@ -50,7 +50,7 @@ private DateObject() {} static final String YEAR_VARIABLE = "year"; static final String YEAR_OF_CENTURY_VARIABLE = "yearOfCentury"; static final String YEAR_OF_ERA_VARIABLE = "yearOfEra"; - + // supported methods static final String GETCENTURY_OF_ERA_METHOD = "getCenturyOfEra"; static final String GETDAY_OF_MONTH_METHOD = "getDayOfMonth"; @@ -70,8 +70,8 @@ private DateObject() {} static final String GETYEAR_METHOD = "getYear"; static final String GETYEAR_OF_CENTURY_METHOD = "getYearOfCentury"; static final String GETYEAR_OF_ERA_METHOD = "getYearOfEra"; - - static ValueSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { + + static DoubleValuesSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { switch (variable) { case CENTURY_OF_ERA_VARIABLE: return new DateObjectValueSource(fieldData, MultiValueMode.MIN, variable, ReadableDateTime::getCenturyOfEra); @@ -110,12 +110,12 @@ static ValueSource getVariable(IndexFieldData fieldData, String fieldName, St case YEAR_OF_ERA_VARIABLE: return new DateObjectValueSource(fieldData, MultiValueMode.MIN, variable, ReadableDateTime::getYearOfEra); default: - throw new IllegalArgumentException("Member variable [" + variable + + throw new IllegalArgumentException("Member variable [" + variable + "] does not exist for date object on field [" + fieldName + "]."); } } - - static ValueSource getMethod(IndexFieldData fieldData, String fieldName, String method) { + + static DoubleValuesSource getMethod(IndexFieldData fieldData, String fieldName, String method) { switch (method) { case GETCENTURY_OF_ERA_METHOD: return new DateObjectValueSource(fieldData, MultiValueMode.MIN, method, ReadableDateTime::getCenturyOfEra); @@ -154,7 +154,7 @@ static ValueSource getMethod(IndexFieldData fieldData, String fieldName, Stri case GETYEAR_OF_ERA_METHOD: return new DateObjectValueSource(fieldData, MultiValueMode.MIN, method, ReadableDateTime::getYearOfEra); default: - throw new IllegalArgumentException("Member method [" + method + + throw new IllegalArgumentException("Member method [" + method + "] does not exist for date object on field [" + fieldName + "]."); } } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObjectValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObjectValueSource.java index 89f991c174d30..d49dd6c0a1943 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObjectValueSource.java +++ 
b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/DateObjectValueSource.java @@ -19,14 +19,8 @@ package org.elasticsearch.script.expression; -import java.io.IOException; -import java.util.Map; -import java.util.Objects; -import java.util.function.ToIntFunction; - import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.LeafNumericFieldData; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.index.fielddata.NumericDoubleValues; @@ -35,6 +29,10 @@ import org.joda.time.MutableDateTime; import org.joda.time.ReadableDateTime; +import java.io.IOException; +import java.util.Objects; +import java.util.function.ToIntFunction; + /** Extracts a portion of a date field with joda time */ class DateObjectValueSource extends FieldDataValueSource { @@ -52,27 +50,26 @@ class DateObjectValueSource extends FieldDataValueSource { } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { + public DoubleValues getValues(LeafReaderContext leaf, DoubleValues scores) { LeafNumericFieldData leafData = (LeafNumericFieldData) fieldData.load(leaf); MutableDateTime joda = new MutableDateTime(0, DateTimeZone.UTC); NumericDoubleValues docValues = multiValueMode.select(leafData.getDoubleValues()); - return new DoubleDocValues(this) { + return DoubleValues.withDefault(new DoubleValues() { + @Override + public double doubleValue() throws IOException { + joda.setMillis((long)docValues.doubleValue()); + return function.applyAsInt(joda); + } + @Override - public double doubleVal(int docId) throws IOException { - if (docValues.advanceExact(docId)) { - long millis = (long)docValues.doubleValue(); - joda.setMillis(millis); - return function.applyAsInt(joda); - } else { - return 0; - } + public boolean advanceExact(int doc) throws IOException { + return docValues.advanceExact(doc); } - }; + }, 0); } @Override - public String description() { + public String toString() { return methodName + ": field(" + fieldData.getFieldName() + ")"; } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/EmptyMemberValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/EmptyMemberValueSource.java index 750b03762a7f9..26d300547fbac 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/EmptyMemberValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/EmptyMemberValueSource.java @@ -20,44 +20,39 @@ package org.elasticsearch.script.expression; import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; -import org.apache.lucene.queries.function.ValueSource; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.LeafNumericFieldData; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.index.fielddata.SortedNumericDoubleValues; import java.io.IOException; -import java.util.Map; -import java.util.Objects; /** * ValueSource to return non-zero if a field is missing. *
<p>
* This is essentially sugar over !count() */ -final class EmptyMemberValueSource extends ValueSource { - final IndexFieldData fieldData; +final class EmptyMemberValueSource extends FieldDataBasedDoubleValuesSource { EmptyMemberValueSource(IndexFieldData fieldData) { - this.fieldData = Objects.requireNonNull(fieldData); + super(fieldData); } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { + public DoubleValues getValues(LeafReaderContext leaf, DoubleValues scores) { LeafNumericFieldData leafData = (LeafNumericFieldData) fieldData.load(leaf); final SortedNumericDoubleValues values = leafData.getDoubleValues(); - return new DoubleDocValues(this) { + return DoubleValues.withDefault(new DoubleValues() { @Override - public double doubleVal(int doc) throws IOException { - if (values.advanceExact(doc)) { - return 0; - } else { - return 1; - } + public double doubleValue() { + return 0; } - }; + + @Override + public boolean advanceExact(int doc) throws IOException { + return values.advanceExact(doc); + } + }, 1); } @Override @@ -71,12 +66,12 @@ public boolean equals(Object obj) { if (obj == null) return false; if (getClass() != obj.getClass()) return false; EmptyMemberValueSource other = (EmptyMemberValueSource) obj; - if (!fieldData.equals(other.fieldData)) return false; - return true; + return fieldData.equals(other.fieldData); } @Override - public String description() { + public String toString() { return "empty: field(" + fieldData.getFieldName() + ")"; } + } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ExpressionScriptEngine.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ExpressionScriptEngine.java index b89a6231a67da..19865bd361e6c 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ExpressionScriptEngine.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ExpressionScriptEngine.java @@ -23,8 +23,8 @@ import org.apache.lucene.expressions.SimpleBindings; import org.apache.lucene.expressions.js.JavascriptCompiler; import org.apache.lucene.expressions.js.VariableContext; -import org.apache.lucene.queries.function.ValueSource; import org.apache.lucene.queries.function.valuesource.DoubleConstValueSource; +import org.apache.lucene.search.DoubleValuesSource; import org.apache.lucene.search.SortField; import org.elasticsearch.SpecialPermission; import org.elasticsearch.common.Nullable; @@ -251,9 +251,9 @@ private static NumberSortScript.LeafFactory newSortScript(Expression expr, Searc } else { // delegate valuesource creation based on field's type // there are three types of "fields" to expressions, and each one has a different "api" of variables and methods. 
- final ValueSource valueSource = getDocValueSource(variable, lookup); - needsScores |= valueSource.getSortField(false).needsScores(); - bindings.add(variable, valueSource.asDoubleValuesSource()); + final DoubleValuesSource valueSource = getDocValueSource(variable, lookup); + needsScores |= valueSource.needsScores(); + bindings.add(variable, valueSource); } } catch (Exception e) { // we defer "binding" of variables until here: give context for that variable @@ -275,8 +275,7 @@ private static TermsSetQueryScript.LeafFactory newTermsSetQueryScript(Expression } else { // delegate valuesource creation based on field's type // there are three types of "fields" to expressions, and each one has a different "api" of variables and methods. - final ValueSource valueSource = getDocValueSource(variable, lookup); - bindings.add(variable, valueSource.asDoubleValuesSource()); + bindings.add(variable, getDocValueSource(variable, lookup)); } } catch (Exception e) { // we defer "binding" of variables until here: give context for that variable @@ -310,9 +309,9 @@ private static AggregationScript.LeafFactory newAggregationScript(Expression exp } else { // delegate valuesource creation based on field's type // there are three types of "fields" to expressions, and each one has a different "api" of variables and methods. - final ValueSource valueSource = getDocValueSource(variable, lookup); - needsScores |= valueSource.getSortField(false).needsScores(); - bindings.add(variable, valueSource.asDoubleValuesSource()); + final DoubleValuesSource valueSource = getDocValueSource(variable, lookup); + needsScores |= valueSource.needsScores(); + bindings.add(variable, valueSource); } } catch (Exception e) { // we defer "binding" of variables until here: give context for that variable @@ -329,8 +328,7 @@ private static FieldScript.LeafFactory newFieldScript(Expression expr, SearchLoo if (vars != null && vars.containsKey(variable)) { bindFromParams(vars, bindings, variable); } else { - final ValueSource valueSource = getDocValueSource(variable, lookup); - bindings.add(variable, valueSource.asDoubleValuesSource()); + bindings.add(variable, getDocValueSource(variable, lookup)); } } catch (Exception e) { throw convertToScriptException("link error", expr.sourceText, variable, e); @@ -382,9 +380,9 @@ private static ScoreScript.LeafFactory newScoreScript(Expression expr, SearchLoo } else { // delegate valuesource creation based on field's type // there are three types of "fields" to expressions, and each one has a different "api" of variables and methods. 
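The ExpressionScriptEngine hunks above and below drop two bridges at once: bindings now take the DoubleValuesSource directly instead of going through valueSource.asDoubleValuesSource(), and the score dependency is read from the source itself instead of via getSortField(false).needsScores(). A sketch of the resulting binding flow (the wrapper methods are illustrative; the Lucene calls are real):

    import java.text.ParseException;
    import org.apache.lucene.expressions.Expression;
    import org.apache.lucene.expressions.SimpleBindings;
    import org.apache.lucene.expressions.js.JavascriptCompiler;
    import org.apache.lucene.search.DoubleValuesSource;

    final class ExpressionBindingSketch {
        // Compiles an expression source and binds one variable to a
        // field-backed DoubleValuesSource, as the engine does per variable.
        static DoubleValuesSource bind(String exprSource, String variable,
                                       DoubleValuesSource fieldSource) throws ParseException {
            Expression expr = JavascriptCompiler.compile(exprSource);
            SimpleBindings bindings = new SimpleBindings();
            bindings.add(variable, fieldSource); // no asDoubleValuesSource() bridge needed
            return expr.getDoubleValuesSource(bindings);
        }

        // Score dependency is now a direct question to the source.
        static boolean needsScores(DoubleValuesSource fieldSource) {
            return fieldSource.needsScores();
        }
    }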
- final ValueSource valueSource = getDocValueSource(variable, lookup); - needsScores |= valueSource.getSortField(false).needsScores(); - bindings.add(variable, valueSource.asDoubleValuesSource()); + final DoubleValuesSource valueSource = getDocValueSource(variable, lookup); + needsScores |= valueSource.needsScores(); + bindings.add(variable, valueSource); } } catch (Exception e) { // we defer "binding" of variables until here: give context for that variable @@ -412,7 +410,7 @@ private static ScriptException convertToScriptException(String message, String s throw new ScriptException(message, cause, stack, source, NAME); } - private static ValueSource getDocValueSource(String variable, SearchLookup lookup) throws ParseException { + private static DoubleValuesSource getDocValueSource(String variable, SearchLookup lookup) throws ParseException { VariableContext[] parts = VariableContext.parse(variable); if (parts[0].text.equals("doc") == false) { throw new ParseException("Unknown variable [" + parts[0].text + "]", 0); @@ -463,7 +461,7 @@ private static ValueSource getDocValueSource(String variable, SearchLookup looku } IndexFieldData fieldData = lookup.doc().getForField(fieldType); - final ValueSource valueSource; + final DoubleValuesSource valueSource; if (fieldType instanceof GeoPointFieldType) { // geo if (methodname == null) { diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/FieldDataBasedDoubleValuesSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/FieldDataBasedDoubleValuesSource.java new file mode 100644 index 0000000000000..ca36ba26bd74e --- /dev/null +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/FieldDataBasedDoubleValuesSource.java @@ -0,0 +1,53 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.elasticsearch.script.expression; + +import org.apache.lucene.index.DocValues; +import org.apache.lucene.index.LeafReaderContext; +import org.apache.lucene.search.DoubleValuesSource; +import org.apache.lucene.search.IndexSearcher; +import org.elasticsearch.index.fielddata.IndexFieldData; + +import java.util.Objects; + +abstract class FieldDataBasedDoubleValuesSource extends DoubleValuesSource { + + FieldDataBasedDoubleValuesSource(IndexFieldData fieldData) { + this.fieldData = Objects.requireNonNull(fieldData); + } + + protected final IndexFieldData fieldData; + + @Override + public boolean needsScores() { + return false; + } + + @Override + public DoubleValuesSource rewrite(IndexSearcher reader) { + return this; + } + + @Override + public boolean isCacheable(LeafReaderContext ctx) { + return DocValues.isCacheable(ctx, fieldData.getFieldName()); + } + +} diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/FieldDataValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/FieldDataValueSource.java index 63050a07da8bf..7ab8c3dd8f8bd 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/FieldDataValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/FieldDataValueSource.java @@ -19,29 +19,26 @@ package org.elasticsearch.script.expression; -import java.io.IOException; -import java.util.Map; -import java.util.Objects; - import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; import org.apache.lucene.queries.function.ValueSource; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.LeafNumericFieldData; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.index.fielddata.NumericDoubleValues; import org.elasticsearch.search.MultiValueMode; +import java.io.IOException; +import java.util.Objects; + /** * A {@link ValueSource} wrapper for field data. 
*/ -class FieldDataValueSource extends ValueSource { +class FieldDataValueSource extends FieldDataBasedDoubleValuesSource { - final IndexFieldData fieldData; final MultiValueMode multiValueMode; protected FieldDataValueSource(IndexFieldData fieldData, MultiValueMode multiValueMode) { - this.fieldData = Objects.requireNonNull(fieldData); + super(fieldData); this.multiValueMode = Objects.requireNonNull(multiValueMode); } @@ -65,24 +62,25 @@ public int hashCode() { } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { + public DoubleValues getValues(LeafReaderContext leaf, DoubleValues scores) { LeafNumericFieldData leafData = (LeafNumericFieldData) fieldData.load(leaf); NumericDoubleValues docValues = multiValueMode.select(leafData.getDoubleValues()); - return new DoubleDocValues(this) { - @Override - public double doubleVal(int doc) throws IOException { - if (docValues.advanceExact(doc)) { + return new DoubleValues() { + @Override + public double doubleValue() throws IOException { return docValues.doubleValue(); - } else { - return 0; } - } + + @Override + public boolean advanceExact(int doc) throws IOException { + return docValues.advanceExact(doc); + } }; } @Override - public String description() { + public String toString() { return "field(" + fieldData.getFieldName() + ")"; } + } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoEmptyValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoEmptyValueSource.java index 2949e1dd9287a..5e3a20f532e7f 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoEmptyValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoEmptyValueSource.java @@ -19,43 +19,38 @@ package org.elasticsearch.script.expression; -import java.io.IOException; -import java.util.Map; -import java.util.Objects; - import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; -import org.apache.lucene.queries.function.ValueSource; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; -import org.elasticsearch.index.fielddata.LeafGeoPointFieldData; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.IndexFieldData; +import org.elasticsearch.index.fielddata.LeafGeoPointFieldData; import org.elasticsearch.index.fielddata.MultiGeoPointValues; +import java.io.IOException; + /** * ValueSource to return non-zero if a field is missing. 
*/ -final class GeoEmptyValueSource extends ValueSource { - IndexFieldData fieldData; +final class GeoEmptyValueSource extends FieldDataBasedDoubleValuesSource { GeoEmptyValueSource(IndexFieldData fieldData) { - this.fieldData = Objects.requireNonNull(fieldData); + super(fieldData); } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { + public DoubleValues getValues(LeafReaderContext leaf, DoubleValues scores) { LeafGeoPointFieldData leafData = (LeafGeoPointFieldData) fieldData.load(leaf); final MultiGeoPointValues values = leafData.getGeoPointValues(); - return new DoubleDocValues(this) { + return DoubleValues.withDefault(new DoubleValues() { @Override - public double doubleVal(int doc) throws IOException { - if (values.advanceExact(doc)) { - return 1; - } else { - return 0; - } + public double doubleValue() { + return 1; } - }; + + @Override + public boolean advanceExact(int doc) throws IOException { + return values.advanceExact(doc); + } + }, 0); } @Override @@ -69,12 +64,12 @@ public boolean equals(Object obj) { if (obj == null) return false; if (getClass() != obj.getClass()) return false; GeoEmptyValueSource other = (GeoEmptyValueSource) obj; - if (!fieldData.equals(other.fieldData)) return false; - return true; + return fieldData.equals(other.fieldData); } @Override - public String description() { + public String toString() { return "empty: field(" + fieldData.getFieldName() + ")"; } + } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoField.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoField.java index 0ce74a187df77..4b15e30aba414 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoField.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoField.java @@ -19,7 +19,7 @@ * under the License. 
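Several of the rewritten sources in these hunks (DateObjectValueSource, EmptyMemberValueSource, and the empty/geo sources) rely on DoubleValues.withDefault(values, missingValue), which makes advanceExact always succeed and substitutes the default for documents the wrapped values skip. A small self-contained sketch of that behavior (the stub factory is illustrative):

    import java.io.IOException;
    import org.apache.lucene.search.DoubleValues;

    final class WithDefaultSketch {
        // A stub DoubleValues that either always has `v` or is always missing.
        static DoubleValues constant(double v, boolean present) {
            return new DoubleValues() {
                @Override public double doubleValue() { return v; }
                @Override public boolean advanceExact(int doc) { return present; }
            };
        }

        public static void main(String[] args) throws IOException {
            DoubleValues lat = DoubleValues.withDefault(constant(42.0, false), 0.0);
            System.out.println(lat.advanceExact(0)); // true: withDefault always advances
            System.out.println(lat.doubleValue());   // 0.0: the default for a missing doc
        }
    }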
*/ -import org.apache.lucene.queries.function.ValueSource; +import org.apache.lucene.search.DoubleValuesSource; import org.elasticsearch.index.fielddata.IndexFieldData; /** @@ -28,18 +28,18 @@ final class GeoField { // no instance private GeoField() {} - + // supported variables static final String EMPTY_VARIABLE = "empty"; static final String LAT_VARIABLE = "lat"; static final String LON_VARIABLE = "lon"; - + // supported methods static final String ISEMPTY_METHOD = "isEmpty"; static final String GETLAT_METHOD = "getLat"; static final String GETLON_METHOD = "getLon"; - - static ValueSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { + + static DoubleValuesSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { switch (variable) { case EMPTY_VARIABLE: return new GeoEmptyValueSource(fieldData); @@ -51,8 +51,8 @@ static ValueSource getVariable(IndexFieldData fieldData, String fieldName, St throw new IllegalArgumentException("Member variable [" + variable + "] does not exist for geo field [" + fieldName + "]."); } } - - static ValueSource getMethod(IndexFieldData fieldData, String fieldName, String method) { + + static DoubleValuesSource getMethod(IndexFieldData fieldData, String fieldName, String method) { switch (method) { case ISEMPTY_METHOD: return new GeoEmptyValueSource(fieldData); diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLatitudeValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLatitudeValueSource.java index 3c04de1658669..8b9a3282ec78b 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLatitudeValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLatitudeValueSource.java @@ -19,43 +19,38 @@ package org.elasticsearch.script.expression; -import java.io.IOException; -import java.util.Map; -import java.util.Objects; - import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; -import org.apache.lucene.queries.function.ValueSource; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.LeafGeoPointFieldData; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.index.fielddata.MultiGeoPointValues; +import java.io.IOException; + /** * ValueSource to return latitudes as a double "stream" for geopoint fields */ -final class GeoLatitudeValueSource extends ValueSource { - final IndexFieldData fieldData; +final class GeoLatitudeValueSource extends FieldDataBasedDoubleValuesSource { GeoLatitudeValueSource(IndexFieldData fieldData) { - this.fieldData = Objects.requireNonNull(fieldData); + super(fieldData); } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { + public DoubleValues getValues(LeafReaderContext leaf, DoubleValues scores) { LeafGeoPointFieldData leafData = (LeafGeoPointFieldData) fieldData.load(leaf); final MultiGeoPointValues values = leafData.getGeoPointValues(); - return new DoubleDocValues(this) { + return DoubleValues.withDefault(new DoubleValues() { + @Override + public double doubleValue() throws IOException { + return values.nextValue().getLat(); + } + @Override - public double doubleVal(int doc) throws IOException { - if (values.advanceExact(doc)) { - 
return values.nextValue().getLat(); - } else { - return 0.0; - } + public boolean advanceExact(int doc) throws IOException { + return values.advanceExact(doc); } - }; + }, 0); } @Override @@ -69,12 +64,11 @@ public boolean equals(Object obj) { if (obj == null) return false; if (getClass() != obj.getClass()) return false; GeoLatitudeValueSource other = (GeoLatitudeValueSource) obj; - if (!fieldData.equals(other.fieldData)) return false; - return true; + return fieldData.equals(other.fieldData); } @Override - public String description() { + public String toString() { return "lat: field(" + fieldData.getFieldName() + ")"; } } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLongitudeValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLongitudeValueSource.java index 8b2c5d83c0257..9bd285f8da871 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLongitudeValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/GeoLongitudeValueSource.java @@ -19,43 +19,38 @@ package org.elasticsearch.script.expression; -import java.io.IOException; -import java.util.Map; -import java.util.Objects; - import org.apache.lucene.index.LeafReaderContext; -import org.apache.lucene.queries.function.FunctionValues; -import org.apache.lucene.queries.function.ValueSource; -import org.apache.lucene.queries.function.docvalues.DoubleDocValues; +import org.apache.lucene.search.DoubleValues; import org.elasticsearch.index.fielddata.LeafGeoPointFieldData; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.index.fielddata.MultiGeoPointValues; +import java.io.IOException; + /** * ValueSource to return longitudes as a double "stream" for geopoint fields */ -final class GeoLongitudeValueSource extends ValueSource { - final IndexFieldData fieldData; +final class GeoLongitudeValueSource extends FieldDataBasedDoubleValuesSource { GeoLongitudeValueSource(IndexFieldData fieldData) { - this.fieldData = Objects.requireNonNull(fieldData); + super(fieldData); } @Override - @SuppressWarnings("rawtypes") // ValueSource uses a rawtype - public FunctionValues getValues(Map context, LeafReaderContext leaf) throws IOException { + public DoubleValues getValues(LeafReaderContext leaf, DoubleValues scores) { LeafGeoPointFieldData leafData = (LeafGeoPointFieldData) fieldData.load(leaf); final MultiGeoPointValues values = leafData.getGeoPointValues(); - return new DoubleDocValues(this) { + return DoubleValues.withDefault(new DoubleValues() { + @Override + public double doubleValue() throws IOException { + return values.nextValue().getLon(); + } + @Override - public double doubleVal(int doc) throws IOException { - if (values.advanceExact(doc)) { - return values.nextValue().getLon(); - } else { - return 0.0; - } + public boolean advanceExact(int doc) throws IOException { + return values.advanceExact(doc); } - }; + }, 0.0); } @Override @@ -74,7 +69,7 @@ public boolean equals(Object obj) { } @Override - public String description() { + public String toString() { return "lon: field(" + fieldData.getFieldName() + ")"; } } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/NumericField.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/NumericField.java index 06875632134d7..0eeff96ba08e4 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/NumericField.java +++ 
b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/NumericField.java @@ -19,7 +19,7 @@ * under the License. */ -import org.apache.lucene.queries.function.ValueSource; +import org.apache.lucene.search.DoubleValuesSource; import org.elasticsearch.index.fielddata.IndexFieldData; import org.elasticsearch.search.MultiValueMode; @@ -46,7 +46,7 @@ private NumericField() {} static final String SUM_METHOD = "sum"; static final String COUNT_METHOD = "count"; - static ValueSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { + static DoubleValuesSource getVariable(IndexFieldData fieldData, String fieldName, String variable) { switch (variable) { case VALUE_VARIABLE: return new FieldDataValueSource(fieldData, MultiValueMode.MIN); @@ -60,7 +60,7 @@ static ValueSource getVariable(IndexFieldData fieldData, String fieldName, St } } - static ValueSource getMethod(IndexFieldData fieldData, String fieldName, String method) { + static DoubleValuesSource getMethod(IndexFieldData fieldData, String fieldName, String method) { switch (method) { case GETVALUE_METHOD: return new FieldDataValueSource(fieldData, MultiValueMode.MIN); diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValueSource.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValueSource.java index a0cfd721adb5a..c237ca3101673 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValueSource.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValueSource.java @@ -80,7 +80,7 @@ public boolean isCacheable(LeafReaderContext ctx) { } @Override - public DoubleValuesSource rewrite(IndexSearcher reader) throws IOException { + public DoubleValuesSource rewrite(IndexSearcher reader) { return this; } } diff --git a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValues.java b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValues.java index 557f9ad812e0c..4efc8f9791cd9 100644 --- a/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValues.java +++ b/modules/lang-expression/src/main/java/org/elasticsearch/script/expression/ReplaceableConstDoubleValues.java @@ -21,8 +21,6 @@ import org.apache.lucene.search.DoubleValues; -import java.io.IOException; - /** * A support class for an executable expression script that allows the double returned * by a {@link DoubleValues} to be modified. 
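Aside, not part of this patch: every migrated source above now implements Lucene's two-method DoubleValues contract — advanceExact(doc) positions on a document and reports whether it has a value, and doubleValue() may only be read after a successful advance — and the geo sources restore the old "missing value reads as zero" behaviour via DoubleValues.withDefault, exactly as used in the hunks above. A minimal sketch of that contract, assuming only Lucene on the classpath; the toy source and class name are hypothetical.

import org.apache.lucene.search.DoubleValues;

import java.io.IOException;

public class DoubleValuesContractSketch {
    public static void main(String[] args) throws IOException {
        // A toy source that only has a value for even doc ids.
        DoubleValues sparse = new DoubleValues() {
            private int doc = -1;

            @Override
            public double doubleValue() {
                // Callers only invoke this after advanceExact(doc) returned
                // true, so no missing-value check is needed here.
                return doc * 2.0;
            }

            @Override
            public boolean advanceExact(int doc) {
                this.doc = doc;
                return doc % 2 == 0;
            }
        };

        // withDefault(...) yields a source that always advances and falls
        // back to the given constant for documents without a value — the
        // same wrapping the geo sources above apply with a default of 0.
        DoubleValues withMissing = DoubleValues.withDefault(sparse, 0);

        for (int doc = 0; doc < 4; doc++) {
            withMissing.advanceExact(doc);
            System.out.println("doc " + doc + " -> " + withMissing.doubleValue());
        }
    }
}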
@@ -35,12 +33,12 @@ void setValue(double value) { } @Override - public double doubleValue() throws IOException { + public double doubleValue() { return value; } @Override - public boolean advanceExact(int doc) throws IOException { + public boolean advanceExact(int doc) { return true; } } diff --git a/modules/lang-mustache/build.gradle b/modules/lang-mustache/build.gradle index 5adc409c8bcc1..ed0641b89fa1a 100644 --- a/modules/lang-mustache/build.gradle +++ b/modules/lang-mustache/build.gradle @@ -26,3 +26,9 @@ dependencies { compile "com.github.spullara.mustache.java:compiler:0.9.6" } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'indices', 'index', 'bulk', + 'put_script', 'render_search_template', 'search_template', 'msearch_template', 'lang_mustache' + } +} diff --git a/modules/lang-painless/build.gradle b/modules/lang-painless/build.gradle index 75e41dcf2e2aa..68a990a3dda9a 100644 --- a/modules/lang-painless/build.gradle +++ b/modules/lang-painless/build.gradle @@ -44,6 +44,13 @@ dependencyLicenses { mapping from: /asm-.*/, to: 'asm' } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'indices', 'index', 'search', 'get', 'bulk', 'update', + 'scripts_painless_execute', 'put_script', 'delete_script' + } +} + test { // in WhenThingsGoWrongTests we intentionally generate an out of memory error, this prevents the heap from being dumped to disk jvmArgs '-XX:-OmitStackTraceInFastThrow', '-XX:-HeapDumpOnOutOfMemoryError' diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/Compiler.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/Compiler.java index ab09010d248d4..e9dcdb1ef409d 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/Compiler.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/Compiler.java @@ -210,8 +210,7 @@ ScriptRoot compile(Loader loader, String name, String source, CompilerSettings s ScriptClassInfo scriptClassInfo = new ScriptClassInfo(painlessLookup, scriptClass); SClass root = Walker.buildPainlessTree(scriptClassInfo, name, source, settings, painlessLookup, null); ScriptRoot scriptRoot = new ScriptRoot(painlessLookup, settings, scriptClassInfo, root); - root.analyze(scriptRoot); - ClassNode classNode = root.writeClass(); + ClassNode classNode = root.writeClass(scriptRoot); DefBootstrapInjectionPhase.phase(classNode); ScriptInjectionPhase.phase(scriptRoot, classNode); byte[] bytes = classNode.write(); @@ -240,8 +239,7 @@ byte[] compile(String name, String source, CompilerSettings settings, Printer de ScriptClassInfo scriptClassInfo = new ScriptClassInfo(painlessLookup, scriptClass); SClass root = Walker.buildPainlessTree(scriptClassInfo, name, source, settings, painlessLookup, debugStream); ScriptRoot scriptRoot = new ScriptRoot(painlessLookup, settings, scriptClassInfo, root); - root.analyze(scriptRoot); - ClassNode classNode = root.writeClass(); + ClassNode classNode = root.writeClass(scriptRoot); DefBootstrapInjectionPhase.phase(classNode); ScriptInjectionPhase.phase(scriptRoot, classNode); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AExpression.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AExpression.java index 189cb94dcd70b..e7fb9e951c59e 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AExpression.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AExpression.java @@ -79,6 +79,18 @@ public static class 
Output { * called on this node to get the type of the node after the cast. */ Class actual = null; + + /** + * The {@link PainlessCast} to convert this expression's actual type + * to the parent expression's expected type. {@code null} if no cast + * is required. + */ + PainlessCast painlessCast = null; + + /** + * The {@link ExpressionNode}(s) generated from this expression. + */ + ExpressionNode expressionNode = null; } /** @@ -90,16 +102,6 @@ public static class Output { */ AExpression prefix; - // TODO: remove placeholders once analysis and write are combined into build - // TODO: https://github.com/elastic/elasticsearch/issues/53561 - // This are used to support the transition from a mutable to immutable state. - // Currently, the IR tree is built during the user tree "write" phase, so - // these are stored on the node to set during the "semantic" phase and then - // use during the "write" phase. - Input input = null; - Output output = null; - PainlessCast cast = null; - /** * Standard constructor with location used for error tracking. */ @@ -121,29 +123,24 @@ public static class Output { /** * Checks for errors and collects data for the writing phase. */ - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { throw new UnsupportedOperationException(); } - /** - * Writes ASM based on the data collected during the analysis phase. - */ - abstract ExpressionNode write(ClassNode classNode); - - void cast() { - cast = AnalyzerCaster.getLegalCast(location, output.actual, input.expected, input.explicit, input.internal); + void cast(Input input, Output output) { + output.painlessCast = AnalyzerCaster.getLegalCast(location, output.actual, input.expected, input.explicit, input.internal); } - ExpressionNode cast(ExpressionNode expressionNode) { - if (cast == null) { - return expressionNode; + ExpressionNode cast(Output output) { + if (output.painlessCast == null) { + return output.expressionNode; } CastNode castNode = new CastNode(); castNode.setLocation(location); - castNode.setExpressionType(cast.targetType); - castNode.setCast(cast); - castNode.setChildNode(expressionNode); + castNode.setExpressionType(output.painlessCast.targetType); + castNode.setCast(output.painlessCast); + castNode.setChildNode(output.expressionNode); return castNode; } diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ANode.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ANode.java index 0566399095565..4790e222e99d7 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ANode.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ANode.java @@ -20,8 +20,6 @@ package org.elasticsearch.painless.node; import org.elasticsearch.painless.Location; -import org.elasticsearch.painless.ir.ClassNode; -import org.elasticsearch.painless.ir.IRNode; import java.util.ArrayList; import java.util.Arrays; @@ -51,12 +49,6 @@ public abstract class ANode { this.location = Objects.requireNonNull(location); } - /** - * Writes ASM based on the data collected during the analysis phase. - * @param classNode the root {@link ClassNode} - */ - abstract IRNode write(ClassNode classNode); - /** * Create an error with location information pointing to this node. 
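Aside, not part of this patch: the two cast(...) helpers on AExpression above follow a wrap-or-pass-through idiom — if analysis recorded a PainlessCast on the Output, the emitted expression node gets wrapped in a CastNode; otherwise the node is returned untouched. A simplified stand-in for that idiom (the types here are hypothetical, not the real Painless classes).

class ExpressionNode {}

class CastNode extends ExpressionNode {
    ExpressionNode child;
    Class<?> targetType;
}

class CastSketch {
    static ExpressionNode maybeCast(ExpressionNode node, Class<?> target, boolean castRequired) {
        if (castRequired == false) {
            return node; // no conversion recorded, reuse the node as-is
        }
        // Otherwise interpose a cast whose child is the original node.
        CastNode cast = new CastNode();
        cast.child = node;
        cast.targetType = target;
        return cast;
    }
}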
*/ diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStatement.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStatement.java index d8d12d8a14698..06c1affc4a4a7 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStatement.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStatement.java @@ -94,13 +94,13 @@ public static class Output { * Set to the approximate number of statements in a loop block to prevent * infinite loops during runtime. */ - int statementCount = 0; - } + int statementCount = 1; - // TODO: remove placeholders once analysis and write are combined into build - // TODO: https://github.com/elastic/elasticsearch/issues/53561 - Input input; - Output output; + /** + * The {@link StatementNode}(s) generated from this expression. + */ + StatementNode statementNode = null; + } /** * Standard constructor with location used for error tracking. @@ -112,12 +112,7 @@ public static class Output { /** * Checks for errors and collects data for the writing phase. */ - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { throw new UnsupportedOperationException(); } - - /** - * Writes ASM based on the data collected during the analysis phase. - */ - abstract StatementNode write(ClassNode classNode); } diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStoreable.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStoreable.java index cc7262f670a94..4bca9bb37cbc5 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStoreable.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/AStoreable.java @@ -21,6 +21,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.Objects; @@ -57,7 +58,7 @@ public static class Input extends AExpression.Input { this.prefix = Objects.requireNonNull(prefix); } - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { throw new UnsupportedOperationException(); } @@ -66,11 +67,4 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { * rhs actual type to avoid an unnecessary cast. */ abstract boolean isDefOptimized(); - - /** - * If this node or a sub-node of this node uses dynamic calls then - * actual will be set to this value. This is used for an optimization - * during assignment to def type targets. 
- */ - abstract void updateActual(Class actual); } diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EAssignment.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EAssignment.java index e995d637520c7..970d31935831b 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EAssignment.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EAssignment.java @@ -25,7 +25,13 @@ import org.elasticsearch.painless.Operation; import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.AssignmentNode; +import org.elasticsearch.painless.ir.BinaryMathNode; +import org.elasticsearch.painless.ir.BraceNode; +import org.elasticsearch.painless.ir.BraceSubDefNode; import org.elasticsearch.painless.ir.ClassNode; +import org.elasticsearch.painless.ir.DotNode; +import org.elasticsearch.painless.ir.DotSubDefNode; +import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.lookup.PainlessCast; import org.elasticsearch.painless.lookup.def; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -37,19 +43,13 @@ /** * Represents an assignment with the lhs and rhs as child nodes. */ -public final class EAssignment extends AExpression { +public class EAssignment extends AExpression { - private AExpression lhs; - private AExpression rhs; - private final boolean pre; - private final boolean post; - private Operation operation; - - private boolean cat = false; - private Class promote = null; - private Class shiftDistance; // for shifts, the RHS is promoted independently - private PainlessCast there = null; - private PainlessCast back = null; + protected final AExpression lhs; + protected final AExpression rhs; + protected final boolean pre; + protected final boolean post; + protected final Operation operation; public EAssignment(Location location, AExpression lhs, AExpression rhs, boolean pre, boolean post, Operation operation) { super(location); @@ -62,11 +62,20 @@ public EAssignment(Location location, AExpression lhs, AExpression rhs, boolean } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); + + AExpression rhs = this.rhs; + Operation operation = this.operation; + boolean cat = false; + Class promote = null; + Class shiftDistance = null; + PainlessCast there = null; + PainlessCast back = null; Output leftOutput; + + Input rightInput = new Input(); Output rightOutput; if (lhs instanceof AStoreable) { @@ -75,7 +84,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { leftInput.read = input.read; leftInput.write = true; - leftOutput = lhs.analyze(scriptRoot, scope, leftInput); + leftOutput = lhs.analyze(classNode, scriptRoot, scope, leftInput); } else { throw new IllegalArgumentException("Left-hand side cannot be assigned a value."); } @@ -117,7 +126,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { } if (operation != null) { - rightOutput = rhs.analyze(scriptRoot, scope, new Input()); + rightOutput = rhs.analyze(classNode, scriptRoot, scope, rightInput); boolean shift = false; if (operation == Operation.MUL) { @@ -161,25 +170,25 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { if (cat) { if (rhs instanceof EBinary && ((EBinary)rhs).operation == Operation.ADD && rightOutput.actual == String.class) { - ((EBinary)rhs).cat = 
true; + ((BinaryMathNode)rightOutput.expressionNode).setCat(true); } } if (shift) { if (promote == def.class) { // shifts are promoted independently, but for the def type, we need object. - rhs.input.expected = promote; + rightInput.expected = promote; } else if (shiftDistance == long.class) { - rhs.input.expected = int.class; - rhs.input.explicit = true; + rightInput.expected = int.class; + rightInput.explicit = true; } else { - rhs.input.expected = shiftDistance; + rightInput.expected = shiftDistance; } } else { - rhs.input.expected = promote; + rightInput.expected = promote; } - rhs.cast(); + rhs.cast(rightInput, rightOutput); there = AnalyzerCaster.getLegalCast(location, leftOutput.actual, promote, false, false); back = AnalyzerCaster.getLegalCast(location, promote, leftOutput.actual, true, false); @@ -188,24 +197,33 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { } else if (rhs != null) { AStoreable lhs = (AStoreable)this.lhs; + // TODO: move this optimization to a later phase // If the lhs node is a def optimized node we update the actual type to remove the need for a cast. if (lhs.isDefOptimized()) { - rightOutput = rhs.analyze(scriptRoot, scope, new Input()); + rightOutput = rhs.analyze(classNode, scriptRoot, scope, rightInput); if (rightOutput.actual == void.class) { throw createError(new IllegalArgumentException("Right-hand side cannot be a [void] type for assignment.")); } - rhs.input.expected = rightOutput.actual; - lhs.updateActual(rightOutput.actual); + rightInput.expected = rightOutput.actual; + leftOutput.actual = rightOutput.actual; + leftOutput.expressionNode.setExpressionType(rightOutput.actual); + + ExpressionNode expressionNode = leftOutput.expressionNode; + + if (expressionNode instanceof DotNode && ((DotNode)expressionNode).getRightNode() instanceof DotSubDefNode) { + ((DotNode)expressionNode).getRightNode().setExpressionType(leftOutput.actual); + } else if (expressionNode instanceof BraceNode && ((BraceNode)expressionNode).getRightNode() instanceof BraceSubDefNode) { + ((BraceNode)expressionNode).getRightNode().setExpressionType(leftOutput.actual); + } // Otherwise, we must adapt the rhs type to the lhs type with a cast. } else { - Input rightInput = new Input(); rightInput.expected = leftOutput.actual; - rhs.analyze(scriptRoot, scope, rightInput); + rightOutput = rhs.analyze(classNode, scriptRoot, scope, rightInput); } - rhs.cast(); + rhs.cast(rightInput, rightOutput); } else { throw new IllegalStateException("Illegal tree structure."); } @@ -213,21 +231,10 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.statement = true; output.actual = input.read ? leftOutput.actual : void.class; - return output; - } - - /** - * Handles writing byte code for variable/method chains for all given possibilities - * including String concatenation, compound assignment, regular assignment, and simple - * reads. Includes proper duplication for chained assignments and assignments that are - * also read from. 
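Aside, not part of this patch: the recurring shape of these node rewrites is that analyze(...) now performs semantic checking and builds the IR node in a single pass, handing the node back through Output.expressionNode instead of a separate write(...) walk over the user tree. Schematically, with hypothetical stand-in types rather than the real Painless classes:

class IrNode {}

class Output {
    Class<?> actual;       // resolved compile-time type of the expression
    IrNode expressionNode; // IR emitted while analyzing
}

abstract class UserTreeNode {
    // One pass: semantic checks plus IR construction, no later write() walk.
    abstract Output analyze();
}

class ConstantExpression extends UserTreeNode {
    final Object constant;

    ConstantExpression(Object constant) {
        this.constant = constant;
    }

    @Override
    Output analyze() {
        Output output = new Output();
        output.actual = constant.getClass();
        output.expressionNode = new IrNode(); // would carry the constant
        return output;
    }
}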
- */ - @Override - AssignmentNode write(ClassNode classNode) { AssignmentNode assignmentNode = new AssignmentNode(); - assignmentNode.setLeftNode(lhs.write(classNode)); - assignmentNode.setRightNode(rhs.cast(rhs.write(classNode))); + assignmentNode.setLeftNode(leftOutput.expressionNode); + assignmentNode.setRightNode(rhs.cast(rightOutput)); assignmentNode.setLocation(location); assignmentNode.setExpressionType(output.actual); @@ -240,7 +247,9 @@ AssignmentNode write(ClassNode classNode) { assignmentNode.setThere(there); assignmentNode.setBack(back); - return assignmentNode; + output.expressionNode = assignmentNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBinary.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBinary.java index 5a65d74d4158c..9f6fc56f97ff2 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBinary.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBinary.java @@ -35,16 +35,11 @@ /** * Represents a binary math expression. */ -public final class EBinary extends AExpression { +public class EBinary extends AExpression { - final Operation operation; - private AExpression left; - private AExpression right; - - private Class promote = null; // promoted type - private Class shiftDistance = null; // for shifts, the rhs is promoted independently - boolean cat = false; - private boolean originallyExplicit = false; // record whether there was originally an explicit cast + protected final Operation operation; + protected final AExpression left; + protected final AExpression right; public EBinary(Location location, Operation operation, AExpression left, AExpression right) { super(location); @@ -55,18 +50,22 @@ public EBinary(Location location, Operation operation, AExpression left, AExpres } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Class promote = null; // promoted type + Class shiftDistance = null; // for shifts, the rhs is promoted independently + boolean originallyExplicit = input.explicit; // record whether there was originally an explicit cast + + Output output = new Output(); - originallyExplicit = input.explicit; + Input leftInput = new Input(); + Output leftOutput = left.analyze(classNode, scriptRoot, scope, leftInput); - Output leftOutput = left.analyze(scriptRoot, scope, new Input()); - Output rightOutput = right.analyze(scriptRoot, scope, new Input()); + Input rightInput = new Input(); + Output rightOutput = right.analyze(classNode, scriptRoot, scope, rightInput); if (operation == Operation.FIND || operation == Operation.MATCH) { - left.input.expected = String.class; - right.input.expected = Pattern.class; + leftInput.expected = String.class; + rightInput.expected = Pattern.class; promote = boolean.class; output.actual = boolean.class; } else { @@ -99,61 +98,58 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = promote; if (operation == Operation.ADD && promote == String.class) { - left.input.expected = leftOutput.actual; - right.input.expected = rightOutput.actual; + leftInput.expected = leftOutput.actual; + rightInput.expected = rightOutput.actual; if (left instanceof EBinary && ((EBinary) left).operation == Operation.ADD && leftOutput.actual == String.class) { - ((EBinary) left).cat = true; + 
((BinaryMathNode)leftOutput.expressionNode).setCat(true); } if (right instanceof EBinary && ((EBinary) right).operation == Operation.ADD && rightOutput.actual == String.class) { - ((EBinary) right).cat = true; + ((BinaryMathNode)rightOutput.expressionNode).setCat(true); } - } else if (promote == def.class || shiftDistance != null && shiftDistance == def.class) { - left.input.expected = leftOutput.actual; - right.input.expected = rightOutput.actual; + } else if (promote == def.class || shiftDistance == def.class) { + leftInput.expected = leftOutput.actual; + rightInput.expected = rightOutput.actual; if (input.expected != null) { output.actual = input.expected; } } else { - left.input.expected = promote; + leftInput.expected = promote; if (operation == Operation.LSH || operation == Operation.RSH || operation == Operation.USH) { if (shiftDistance == long.class) { - right.input.expected = int.class; - right.input.explicit = true; + rightInput.expected = int.class; + rightInput.explicit = true; } else { - right.input.expected = shiftDistance; + rightInput.expected = shiftDistance; } } else { - right.input.expected = promote; + rightInput.expected = promote; } } } - left.cast(); - right.cast(); - - return output; - } + left.cast(leftInput, leftOutput); + right.cast(rightInput, rightOutput); - @Override - BinaryMathNode write(ClassNode classNode) { BinaryMathNode binaryMathNode = new BinaryMathNode(); - binaryMathNode.setLeftNode(left.cast(left.write(classNode))); - binaryMathNode.setRightNode(right.cast(right.write(classNode))); + binaryMathNode.setLeftNode(left.cast(leftOutput)); + binaryMathNode.setRightNode(right.cast(rightOutput)); binaryMathNode.setLocation(location); binaryMathNode.setExpressionType(output.actual); binaryMathNode.setBinaryType(promote); binaryMathNode.setShiftType(shiftDistance); binaryMathNode.setOperation(operation); - binaryMathNode.setCat(cat); + binaryMathNode.setCat(false); binaryMathNode.setOriginallExplicit(originallyExplicit); - return binaryMathNode; + output.expressionNode = binaryMathNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBool.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBool.java index f2acae9299f20..e9dac2123c6cd 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBool.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBool.java @@ -31,11 +31,11 @@ /** * Represents a boolean expression. 
*/ -public final class EBool extends AExpression { +public class EBool extends AExpression { - private final Operation operation; - private AExpression left; - private AExpression right; + protected final Operation operation; + protected final AExpression left; + protected final AExpression right; public EBool(Location location, Operation operation, AExpression left, AExpression right) { super(location); @@ -46,37 +46,33 @@ public EBool(Location location, Operation operation, AExpression left, AExpressi } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); Input leftInput = new Input(); leftInput.expected = boolean.class; - left.analyze(scriptRoot, scope, leftInput); - left.cast(); + Output leftOutput = left.analyze(classNode, scriptRoot, scope, leftInput); + left.cast(leftInput, leftOutput); Input rightInput = new Input(); rightInput.expected = boolean.class; - right.analyze(scriptRoot, scope, rightInput); - right.cast(); + Output rightOutput = right.analyze(classNode, scriptRoot, scope, rightInput); + right.cast(rightInput, rightOutput); output.actual = boolean.class; - return output; - } - - @Override - BooleanNode write(ClassNode classNode) { BooleanNode booleanNode = new BooleanNode(); - booleanNode.setLeftNode(left.cast(left.write(classNode))); - booleanNode.setRightNode(right.cast(right.write(classNode))); + booleanNode.setLeftNode(left.cast(leftOutput)); + booleanNode.setRightNode(right.cast(rightOutput)); booleanNode.setLocation(location); booleanNode.setExpressionType(output.actual); booleanNode.setOperation(operation); - return booleanNode; + output.expressionNode = booleanNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBoolean.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBoolean.java index 6e78fb972008c..b1362e99cb2a5 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBoolean.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EBoolean.java @@ -23,13 +23,12 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.ConstantNode; -import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.symbol.ScriptRoot; /** * Represents a boolean constant. 
*/ -public final class EBoolean extends AExpression { +public class EBoolean extends AExpression { protected boolean constant; @@ -40,9 +39,8 @@ public EBoolean(Location location, boolean constant) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("Must read from constant [" + constant + "].")); @@ -50,17 +48,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = boolean.class; - return output; - } - - @Override - ExpressionNode write(ClassNode classNode) { ConstantNode constantNode = new ConstantNode(); constantNode.setLocation(location); constantNode.setExpressionType(output.actual); constantNode.setConstant(constant); - return constantNode; + output.expressionNode = constantNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECallLocal.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECallLocal.java index ad4dce45d4d7a..c86966df03f60 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECallLocal.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECallLocal.java @@ -23,6 +23,7 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.MemberCallNode; +import org.elasticsearch.painless.ir.FieldNode; import org.elasticsearch.painless.lookup.PainlessClassBinding; import org.elasticsearch.painless.lookup.PainlessInstanceBinding; import org.elasticsearch.painless.lookup.PainlessMethod; @@ -32,35 +33,35 @@ import java.lang.reflect.Modifier; import java.util.ArrayList; +import java.util.Collections; import java.util.List; import java.util.Objects; /** * Represents a user-defined call. 
*/ -public final class ECallLocal extends AExpression { +public class ECallLocal extends AExpression { - private final String name; - private final List arguments; - - private FunctionTable.LocalFunction localFunction = null; - private PainlessMethod importedMethod = null; - private PainlessClassBinding classBinding = null; - private int classBindingOffset = 0; - private PainlessInstanceBinding instanceBinding = null; - private String bindingName = null; + protected final String name; + protected final List arguments; public ECallLocal(Location location, String name, List arguments) { super(location); this.name = Objects.requireNonNull(name); - this.arguments = Objects.requireNonNull(arguments); + this.arguments = Collections.unmodifiableList(Objects.requireNonNull(arguments)); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + FunctionTable.LocalFunction localFunction = null; + PainlessMethod importedMethod = null; + PainlessClassBinding classBinding = null; + int classBindingOffset = 0; + PainlessInstanceBinding instanceBinding = null; + String bindingName = null; + + Output output = new Output(); localFunction = scriptRoot.getFunctionTable().getFunction(name, arguments.size()); @@ -125,43 +126,53 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { typeParameters = new ArrayList<>(classBinding.typeParameters); output.actual = classBinding.returnType; bindingName = scriptRoot.getNextSyntheticName("class_binding"); - scriptRoot.getClassNode().addField(new SField(location, - Modifier.PRIVATE, bindingName, classBinding.javaConstructor.getDeclaringClass())); + + FieldNode fieldNode = new FieldNode(); + fieldNode.setLocation(location); + fieldNode.setModifiers(Modifier.PRIVATE); + fieldNode.setFieldType(classBinding.javaConstructor.getDeclaringClass()); + fieldNode.setName(bindingName); + + classNode.addFieldNode(fieldNode); } else if (instanceBinding != null) { typeParameters = new ArrayList<>(instanceBinding.typeParameters); output.actual = instanceBinding.returnType; bindingName = scriptRoot.getNextSyntheticName("instance_binding"); - scriptRoot.getClassNode().addField(new SField(location, Modifier.STATIC | Modifier.PUBLIC, - bindingName, instanceBinding.targetInstance.getClass())); + + FieldNode fieldNode = new FieldNode(); + fieldNode.setLocation(location); + fieldNode.setModifiers(Modifier.PUBLIC | Modifier.STATIC); + fieldNode.setFieldType(instanceBinding.targetInstance.getClass()); + fieldNode.setName(bindingName); + + classNode.addFieldNode(fieldNode); + scriptRoot.addStaticConstant(bindingName, instanceBinding.targetInstance); } else { throw new IllegalStateException("Illegal tree structure."); } + List argumentOutputs = new ArrayList<>(arguments.size()); // if the class binding is using an implicit this reference then the arguments counted must // be incremented by 1 as the this reference will not be part of the arguments passed into // the class binding call for (int argument = 0; argument < arguments.size(); ++argument) { AExpression expression = arguments.get(argument); - Input expressionInput = new Input(); - expressionInput.expected = typeParameters.get(argument + classBindingOffset); - expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + Input argumentInput = new Input(); + argumentInput.expected = typeParameters.get(argument + 
classBindingOffset); + argumentInput.internal = true; + Output argumentOutput = expression.analyze(classNode, scriptRoot, scope, argumentInput); + expression.cast(argumentInput, argumentOutput); + argumentOutputs.add(argumentOutput); } output.statement = true; - return output; - } - - @Override - MemberCallNode write(ClassNode classNode) { MemberCallNode memberCallNode = new MemberCallNode(); - for (AExpression argument : arguments) { - memberCallNode.addArgumentNode(argument.cast(argument.write(classNode))); + for (int argument = 0; argument < arguments.size(); ++argument) { + memberCallNode.addArgumentNode(arguments.get(argument).cast(argumentOutputs.get(argument))); } memberCallNode.setLocation(location); @@ -173,7 +184,9 @@ MemberCallNode write(ClassNode classNode) { memberCallNode.setBindingName(bindingName); memberCallNode.setInstanceBinding(instanceBinding); - return memberCallNode; + output.expressionNode = memberCallNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECapturingFunctionRef.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECapturingFunctionRef.java index fa8b25cd861f9..e5f3b11910d18 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECapturingFunctionRef.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ECapturingFunctionRef.java @@ -35,11 +35,12 @@ /** * Represents a capturing function reference. For member functions that require a this reference, ie not static. */ -public final class ECapturingFunctionRef extends AExpression implements ILambda { - private final String variable; - private final String call; +public class ECapturingFunctionRef extends AExpression implements ILambda { - private FunctionRef ref; + protected final String variable; + protected final String call; + + // TODO: #54015 private Variable captured; private String defPointer; @@ -51,9 +52,10 @@ public ECapturingFunctionRef(Location location, String variable, String call) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + FunctionRef ref = null; + + Output output = new Output(); captured = scope.getVariable(location, variable); if (input.expected == null) { @@ -75,11 +77,6 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = input.expected; } - return output; - } - - @Override - CapturingFuncRefNode write(ClassNode classNode) { CapturingFuncRefNode capturingFuncRefNode = new CapturingFuncRefNode(); capturingFuncRefNode.setLocation(location); @@ -87,9 +84,11 @@ CapturingFuncRefNode write(ClassNode classNode) { capturingFuncRefNode.setCapturedName(captured.getName()); capturingFuncRefNode.setName(call); capturingFuncRefNode.setPointer(defPointer); - capturingFuncRefNode.setFuncRef(ref);; + capturingFuncRefNode.setFuncRef(ref); - return capturingFuncRefNode; + output.expressionNode = capturingFuncRefNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EComp.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EComp.java index 32555c18625f7..359546e805a40 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EComp.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EComp.java @@ -34,13 +34,11 @@ /** * Represents a 
comparison expression. */ -public final class EComp extends AExpression { +public class EComp extends AExpression { - private final Operation operation; - private AExpression left; - private AExpression right; - - private Class promotedType; + protected final Operation operation; + protected final AExpression left; + protected final AExpression right; public EComp(Location location, Operation operation, AExpression left, AExpression right) { super(location); @@ -51,12 +49,16 @@ public EComp(Location location, Operation operation, AExpression left, AExpressi } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Class promotedType; + + Output output = new Output(); + + Input leftInput = new Input(); + Output leftOutput = left.analyze(classNode, scriptRoot, scope, leftInput); - Output leftOutput = left.analyze(scriptRoot, scope, new Input()); - Output rightOutput = right.analyze(scriptRoot, scope, new Input()); + Input rightInput = new Input(); + Output rightOutput = right.analyze(classNode, scriptRoot, scope, rightInput); if (operation == Operation.EQ || operation == Operation.EQR || operation == Operation.NE || operation == Operation.NER) { promotedType = AnalyzerCaster.promoteEquality(leftOutput.actual, rightOutput.actual); @@ -74,11 +76,11 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { } if (operation != Operation.EQR && operation != Operation.NER && promotedType == def.class) { - left.input.expected = leftOutput.actual; - right.input.expected = rightOutput.actual; + leftInput.expected = leftOutput.actual; + rightInput.expected = rightOutput.actual; } else { - left.input.expected = promotedType; - right.input.expected = promotedType; + leftInput.expected = promotedType; + rightInput.expected = promotedType; } if ((operation == Operation.EQ || operation == Operation.EQR || operation == Operation.NE || operation == Operation.NER) @@ -86,27 +88,24 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { throw createError(new IllegalArgumentException("extraneous comparison of [null] constants")); } - left.cast(); - right.cast(); + left.cast(leftInput, leftOutput); + right.cast(rightInput, rightOutput); output.actual = boolean.class; - return output; - } - - @Override - ComparisonNode write(ClassNode classNode) { ComparisonNode comparisonNode = new ComparisonNode(); - comparisonNode.setLeftNode(left.cast(left.write(classNode))); - comparisonNode.setRightNode(right.cast(right.write(classNode))); + comparisonNode.setLeftNode(left.cast(leftOutput)); + comparisonNode.setRightNode(right.cast(rightOutput)); comparisonNode.setLocation(location); comparisonNode.setExpressionType(output.actual); comparisonNode.setComparisonType(promotedType); comparisonNode.setOperation(operation); - return comparisonNode; + output.expressionNode = comparisonNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConditional.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConditional.java index d523838ecf66d..2b4440787347b 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConditional.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConditional.java @@ -32,11 +32,11 @@ /** * Represents a conditional expression. 
*/ -public final class EConditional extends AExpression { +public class EConditional extends AExpression { - private AExpression condition; - private AExpression left; - private AExpression right; + protected final AExpression condition; + protected final AExpression left; + protected final AExpression right; public EConditional(Location location, AExpression condition, AExpression left, AExpression right) { super(location); @@ -47,30 +47,28 @@ public EConditional(Location location, AExpression condition, AExpression left, } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); Input conditionInput = new Input(); conditionInput.expected = boolean.class; - condition.analyze(scriptRoot, scope, conditionInput); - condition.cast(); + Output conditionOutput = condition.analyze(classNode, scriptRoot, scope, conditionInput); + condition.cast(conditionInput, conditionOutput); Input leftInput = new Input(); leftInput.expected = input.expected; leftInput.explicit = input.explicit; leftInput.internal = input.internal; + Output leftOutput = left.analyze(classNode, scriptRoot, scope, leftInput); Input rightInput = new Input(); rightInput.expected = input.expected; rightInput.explicit = input.explicit; rightInput.internal = input.internal; + Output rightOutput = right.analyze(classNode, scriptRoot, scope, rightInput); output.actual = input.expected; - Output leftOutput = left.analyze(scriptRoot, scope, leftInput); - Output rightOutput = right.analyze(scriptRoot, scope, rightInput); - if (input.expected == null) { Class promote = AnalyzerCaster.promoteConditional(leftOutput.actual, rightOutput.actual); @@ -80,29 +78,26 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { "[" + PainlessLookupUtility.typeToCanonicalTypeName(rightOutput.actual) + "]")); } - left.input.expected = promote; - right.input.expected = promote; + leftInput.expected = promote; + rightInput.expected = promote; output.actual = promote; } - left.cast(); - right.cast(); - - return output; - } + left.cast(leftInput, leftOutput); + right.cast(rightInput, rightOutput); - @Override - ConditionalNode write(ClassNode classNode) { ConditionalNode conditionalNode = new ConditionalNode(); - conditionalNode.setLeftNode(left.cast(left.write(classNode))); - conditionalNode.setRightNode(right.cast(right.write(classNode))); - conditionalNode.setConditionNode(condition.cast(condition.write(classNode))); + conditionalNode.setLeftNode(left.cast(leftOutput)); + conditionalNode.setRightNode(right.cast(rightOutput)); + conditionalNode.setConditionNode(condition.cast(conditionOutput)); conditionalNode.setLocation(location); conditionalNode.setExpressionType(output.actual); - return conditionalNode; + output.expressionNode = conditionalNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConstant.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConstant.java index 5caa6bca92895..317a9dae3bd3d 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConstant.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EConstant.java @@ -30,7 +30,7 @@ * Represents a constant inserted into the tree replacing * other constants during constant folding. (Internal only.) 
*/ -final class EConstant extends AExpression { +public class EConstant extends AExpression { protected Object constant; @@ -41,9 +41,8 @@ final class EConstant extends AExpression { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (constant instanceof String) { output.actual = String.class; @@ -69,17 +68,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { "for constant node")); } - return output; - } - - @Override - ConstantNode write(ClassNode classNode) { ConstantNode constantNode = new ConstantNode(); constantNode.setLocation(location); constantNode.setExpressionType(output.actual); constantNode.setConstant(constant); - return constantNode; + output.expressionNode = constantNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EDecimal.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EDecimal.java index 405f90b356f1a..35388db216cb5 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EDecimal.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EDecimal.java @@ -23,7 +23,6 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.ConstantNode; -import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.Objects; @@ -31,11 +30,9 @@ /** * Represents a decimal constant. */ -public final class EDecimal extends AExpression { +public class EDecimal extends AExpression { - private final String value; - - protected Object constant; + protected final String value; public EDecimal(Location location, String value) { super(location); @@ -44,9 +41,10 @@ public EDecimal(Location location, String value) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Object constant; + + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("Must read from constant [" + value + "].")); @@ -72,17 +70,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { } } - return output; - } - - @Override - ExpressionNode write(ClassNode classNode) { ConstantNode constantNode = new ConstantNode(); constantNode.setLocation(location); constantNode.setExpressionType(output.actual); constantNode.setConstant(constant); - return constantNode; + output.expressionNode = constantNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EElvis.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EElvis.java index 6c771af47caf4..6ccb1a3f397b8 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EElvis.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EElvis.java @@ -33,8 +33,9 @@ * non null. If the first expression is null then it evaluates the second expression and returns it. 
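Aside, not part of this patch: EElvis below models the `?:` operator — evaluate the left expression and use it if non-null, otherwise fall back to the right. The equivalent plain Java, for illustration only (ElvisSketch is a hypothetical name).

class ElvisSketch {
    // Painless: lhs ?: rhs
    static String elvis(String lhs, String rhs) {
        return lhs != null ? lhs : rhs;
    }

    public static void main(String[] args) {
        System.out.println(elvis(null, "fallback"));    // fallback
        System.out.println(elvis("value", "fallback")); // value
    }
}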
*/ public class EElvis extends AExpression { - private AExpression lhs; - private AExpression rhs; + + protected AExpression lhs; + protected AExpression rhs; public EElvis(Location location, AExpression lhs, AExpression rhs) { super(location); @@ -44,24 +45,26 @@ public EElvis(Location location, AExpression lhs, AExpression rhs) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.expected != null && input.expected.isPrimitive()) { throw createError(new IllegalArgumentException("Elvis operator cannot return primitives")); } + Input leftInput = new Input(); leftInput.expected = input.expected; leftInput.explicit = input.explicit; leftInput.internal = input.internal; + Output leftOutput = lhs.analyze(classNode, scriptRoot, scope, leftInput); + Input rightInput = new Input(); rightInput.expected = input.expected; rightInput.explicit = input.explicit; rightInput.internal = input.internal; + Output rightOutput = rhs.analyze(classNode, scriptRoot, scope, rightInput); + output.actual = input.expected; - Output leftOutput = lhs.analyze(scriptRoot, scope, leftInput); - Output rightOutput = rhs.analyze(scriptRoot, scope, rightInput); if (lhs instanceof ENull) { throw createError(new IllegalArgumentException("Extraneous elvis operator. LHS is null.")); @@ -83,28 +86,25 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { if (input.expected == null) { Class promote = AnalyzerCaster.promoteConditional(leftOutput.actual, rightOutput.actual); - lhs.input.expected = promote; - rhs.input.expected = promote; + leftInput.expected = promote; + rightInput.expected = promote; output.actual = promote; } - lhs.cast(); - rhs.cast(); - - return output; - } + lhs.cast(leftInput, leftOutput); + rhs.cast(rightInput, rightOutput); - @Override - ElvisNode write(ClassNode classNode) { ElvisNode elvisNode = new ElvisNode(); - elvisNode.setLeftNode(lhs.cast(lhs.write(classNode))); - elvisNode.setRightNode(rhs.cast(rhs.write(classNode))); + elvisNode.setLeftNode(lhs.cast(leftOutput)); + elvisNode.setRightNode(rhs.cast(rightOutput)); elvisNode.setLocation(location); elvisNode.setExpressionType(output.actual); - return elvisNode; + output.expressionNode = elvisNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EExplicit.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EExplicit.java index b76dcd946b3b3..532e9607fb7a6 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EExplicit.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EExplicit.java @@ -22,7 +22,6 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; -import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.Objects; @@ -30,10 +29,10 @@ /** * Represents an explicit cast. 
*/ -public final class EExplicit extends AExpression { +public class EExplicit extends AExpression { - private final String type; - private AExpression child; + protected final String type; + protected final AExpression child; public EExplicit(Location location, String type, AExpression child) { super(location); @@ -43,9 +42,8 @@ public EExplicit(Location location, String type, AExpression child) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); output.actual = scriptRoot.getPainlessLookup().canonicalTypeNameToType(type); @@ -56,15 +54,12 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { Input childInput = new Input(); childInput.expected = output.actual; childInput.explicit = true; - child.analyze(scriptRoot, scope, childInput); - child.cast(); + Output childOutput = child.analyze(classNode, scriptRoot, scope, childInput); + child.cast(childInput, childOutput); - return output; - } + output.expressionNode = child.cast(childOutput); - @Override - ExpressionNode write(ClassNode classNode) { - return child.cast(child.write(classNode)); + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EFunctionRef.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EFunctionRef.java index 62bb34c215590..a9cb7876fe11a 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EFunctionRef.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EFunctionRef.java @@ -33,11 +33,12 @@ /** * Represents a function reference. */ -public final class EFunctionRef extends AExpression implements ILambda { - private final String type; - private final String call; +public class EFunctionRef extends AExpression implements ILambda { - private FunctionRef ref; + protected final String type; + protected final String call; + + // TODO: #54015 private String defPointer; public EFunctionRef(Location location, String type, String call) { @@ -48,9 +49,10 @@ public EFunctionRef(Location location, String type, String call) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + FunctionRef ref; + + Output output = new Output(); if (input.expected == null) { ref = null; @@ -63,18 +65,15 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = input.expected; } - return output; - } - - @Override - FuncRefNode write(ClassNode classNode) { FuncRefNode funcRefNode = new FuncRefNode(); funcRefNode.setLocation(location); funcRefNode.setExpressionType(output.actual); funcRefNode.setFuncRef(ref); - return funcRefNode; + output.expressionNode = funcRefNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EInstanceof.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EInstanceof.java index e6ee1a03602b8..9bf0a00e610ca 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EInstanceof.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EInstanceof.java @@ -33,13 +33,10 @@ *
* <p>
* Unlike java's, this works for primitive types too. */ -public final class EInstanceof extends AExpression { - private AExpression expression; - private final String type; +public class EInstanceof extends AExpression { - private Class resolvedType; - private Class expressionType; - private boolean primitiveExpression; + protected final AExpression expression; + protected final String type; public EInstanceof(Location location, AExpression expression, String type) { super(location); @@ -48,9 +45,12 @@ public EInstanceof(Location location, AExpression expression, String type) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Class resolvedType; + Class expressionType; + boolean primitiveExpression; + + Output output = new Output(); // ensure the specified type is part of the definition Class clazz = scriptRoot.getPainlessLookup().canonicalTypeNameToType(this.type); @@ -64,9 +64,10 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { PainlessLookupUtility.typeToJavaType(clazz); // analyze and cast the expression - Output expressionOutput = expression.analyze(scriptRoot, scope, new Input()); - expression.input.expected = expressionOutput.actual; - expression.cast(); + Input expressionInput = new Input(); + Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expressionInput.expected = expressionOutput.actual; + expression.cast(expressionInput, expressionOutput); // record if the expression returns a primitive primitiveExpression = expressionOutput.actual.isPrimitive(); @@ -76,14 +77,9 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = boolean.class; - return output; - } - - @Override - InstanceofNode write(ClassNode classNode) { InstanceofNode instanceofNode = new InstanceofNode(); - instanceofNode.setChildNode(expression.cast(expression.write(classNode))); + instanceofNode.setChildNode(expression.cast(expressionOutput)); instanceofNode.setLocation(location); instanceofNode.setExpressionType(output.actual); @@ -91,7 +87,9 @@ InstanceofNode write(ClassNode classNode) { instanceofNode.setResolvedType(resolvedType); instanceofNode.setPrimitiveResult(primitiveExpression); - return instanceofNode; + output.expressionNode = instanceofNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ELambda.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ELambda.java index a4cb19d048c62..41ffd62df044f 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ELambda.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ELambda.java @@ -24,6 +24,7 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.Scope.LambdaScope; import org.elasticsearch.painless.Scope.Variable; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.FunctionNode; import org.elasticsearch.painless.ir.LambdaNode; @@ -59,27 +60,16 @@ *
* {@code sort(list, lambda$0(capture))} */ -public final class ELambda extends AExpression implements ILambda { +public class ELambda extends AExpression implements ILambda { - private final List paramTypeStrs; - private final List paramNameStrs; - private final List statements; + protected final List paramTypeStrs; + protected final List paramNameStrs; + protected final List statements; - // captured variables + // TODO: #54015 private List captures; - // static parent, static lambda - private FunctionRef ref; - // dynamic parent, deferred until link time private String defPointer; - private String name; - private Class returnType; - private List> typeParameters; - private List parameterNames; - private SBlock block; - private boolean methodEscape; - private int maxLoopCounter; - public ELambda(Location location, List paramTypes, List paramNames, List statements) { @@ -91,11 +81,18 @@ public ELambda(Location location, } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); - - List> typeParameters = new ArrayList<>(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + String name; + Class returnType; + List> typeParametersWithCaptures; + List parameterNames; + SBlock block; + int maxLoopCounter; + FunctionRef ref; + + Output output = new Output(); + + List> typeParameters; PainlessMethod interfaceMethod; // inspect the target first, set interface method if we know it. if (input.expected == null) { @@ -157,8 +154,8 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { for (int index = 0; index < typeParameters.size(); ++index) { Class type = typeParameters.get(index); - String name = paramNameStrs.get(index); - lambdaScope.defineVariable(location, type, name, true); + String paramName = paramNameStrs.get(index); + lambdaScope.defineVariable(location, type, paramName, true); } block = new SBlock(location, statements); @@ -167,7 +164,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { } AStatement.Input blockInput = new AStatement.Input(); blockInput.lastSource = true; - AStatement.Output blockOutput = block.analyze(scriptRoot, lambdaScope, blockInput); + AStatement.Output blockOutput = block.analyze(classNode, scriptRoot, lambdaScope, blockInput); if (blockOutput.methodEscape == false) { throw createError(new IllegalArgumentException("not all paths return a value for lambda")); @@ -177,18 +174,18 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { // prepend capture list to lambda's arguments captures = new ArrayList<>(lambdaScope.getCaptures()); - this.typeParameters = new ArrayList<>(captures.size() + typeParameters.size()); + typeParametersWithCaptures = new ArrayList<>(captures.size() + typeParameters.size()); parameterNames = new ArrayList<>(captures.size() + paramNameStrs.size()); for (Variable var : captures) { - this.typeParameters.add(var.getType()); + typeParametersWithCaptures.add(var.getType()); parameterNames.add(var.getName()); } - this.typeParameters.addAll(typeParameters); + typeParametersWithCaptures.addAll(typeParameters); parameterNames.addAll(paramNameStrs); // desugar lambda body into a synthetic method name = scriptRoot.getNextSyntheticName("lambda"); - scriptRoot.getFunctionTable().addFunction(name, returnType, this.typeParameters, true, true); + scriptRoot.getFunctionTable().addFunction(name, returnType, typeParametersWithCaptures, true, true); // setup method reference to synthetic method if (input.expected 
== null) { @@ -202,19 +199,12 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = input.expected; } - return output; - } - - @Override - LambdaNode write(ClassNode classNode) { FunctionNode functionNode = new FunctionNode(); - - functionNode.setBlockNode(block.write(classNode)); - + functionNode.setBlockNode((BlockNode)blockOutput.statementNode); functionNode.setLocation(location); functionNode.setName(name); functionNode.setReturnType(returnType); - functionNode.getTypeParameters().addAll(typeParameters); + functionNode.getTypeParameters().addAll(typeParametersWithCaptures); functionNode.getParameterNames().addAll(parameterNames); functionNode.setStatic(true); functionNode.setVarArgs(false); @@ -224,7 +214,6 @@ LambdaNode write(ClassNode classNode) { classNode.addFunctionNode(functionNode); LambdaNode lambdaNode = new LambdaNode(); - lambdaNode.setLocation(location); lambdaNode.setExpressionType(output.actual); lambdaNode.setFuncRef(ref); @@ -233,7 +222,9 @@ LambdaNode write(ClassNode classNode) { lambdaNode.addCapture(capture.getName()); } - return lambdaNode; + output.expressionNode = lambdaNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EListInit.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EListInit.java index cb49bce8a11c3..c60a9d1bea78b 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EListInit.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EListInit.java @@ -29,29 +29,28 @@ import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.ArrayList; +import java.util.Collections; import java.util.List; +import java.util.Objects; import static org.elasticsearch.painless.lookup.PainlessLookupUtility.typeToCanonicalTypeName; /** * Represents a list initialization shortcut. 
*/ -public final class EListInit extends AExpression { - private final List values; +public class EListInit extends AExpression { - private PainlessConstructor constructor = null; - private PainlessMethod method = null; + protected final List values; public EListInit(Location location, List values) { super(location); - this.values = values; + this.values = Collections.unmodifiableList(Objects.requireNonNull(values)); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("Must read from list initializer.")); @@ -59,38 +58,34 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = ArrayList.class; - constructor = scriptRoot.getPainlessLookup().lookupPainlessConstructor(output.actual, 0); + PainlessConstructor constructor = scriptRoot.getPainlessLookup().lookupPainlessConstructor(output.actual, 0); if (constructor == null) { throw createError(new IllegalArgumentException( "constructor [" + typeToCanonicalTypeName(output.actual) + ", /0] not found")); } - method = scriptRoot.getPainlessLookup().lookupPainlessMethod(output.actual, false, "add", 1); + PainlessMethod method = scriptRoot.getPainlessLookup().lookupPainlessMethod(output.actual, false, "add", 1); if (method == null) { throw createError(new IllegalArgumentException("method [" + typeToCanonicalTypeName(output.actual) + ", add/1] not found")); } - for (int index = 0; index < values.size(); ++index) { - AExpression expression = values.get(index); + List valueOutputs = new ArrayList<>(values.size()); + for (AExpression expression : values) { Input expressionInput = new Input(); expressionInput.expected = def.class; expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); + valueOutputs.add(expressionOutput); } - return output; - } - - @Override - ListInitializationNode write(ClassNode classNode) { ListInitializationNode listInitializationNode = new ListInitializationNode(); - for (AExpression value : values) { - listInitializationNode.addArgumentNode(value.cast(value.write(classNode))); + for (int i = 0; i < values.size(); ++i) { + listInitializationNode.addArgumentNode(values.get(i).cast(valueOutputs.get(i))); } listInitializationNode.setLocation(location); @@ -98,7 +93,9 @@ ListInitializationNode write(ClassNode classNode) { listInitializationNode.setConstructor(constructor); listInitializationNode.setMethod(method); - return listInitializationNode; + output.expressionNode = listInitializationNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EMapInit.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EMapInit.java index f846a18eb89f9..fbe500c027dab 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EMapInit.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EMapInit.java @@ -28,32 +28,32 @@ import org.elasticsearch.painless.lookup.def; import org.elasticsearch.painless.symbol.ScriptRoot; +import java.util.ArrayList; +import java.util.Collections; import java.util.HashMap; import java.util.List; 
+import java.util.Objects; import static org.elasticsearch.painless.lookup.PainlessLookupUtility.typeToCanonicalTypeName; /** * Represents a map initialization shortcut. */ -public final class EMapInit extends AExpression { - private final List keys; - private final List values; +public class EMapInit extends AExpression { - private PainlessConstructor constructor = null; - private PainlessMethod method = null; + protected final List keys; + protected final List values; public EMapInit(Location location, List keys, List values) { super(location); - this.keys = keys; - this.values = values; + this.keys = Collections.unmodifiableList(Objects.requireNonNull(keys)); + this.values = Collections.unmodifiableList(Objects.requireNonNull(values)); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("Must read from map initializer.")); @@ -61,14 +61,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = HashMap.class; - constructor = scriptRoot.getPainlessLookup().lookupPainlessConstructor(output.actual, 0); + PainlessConstructor constructor = scriptRoot.getPainlessLookup().lookupPainlessConstructor(output.actual, 0); if (constructor == null) { throw createError(new IllegalArgumentException( "constructor [" + typeToCanonicalTypeName(output.actual) + ", /0] not found")); } - method = scriptRoot.getPainlessLookup().lookupPainlessMethod(output.actual, false, "put", 2); + PainlessMethod method = scriptRoot.getPainlessLookup().lookupPainlessMethod(output.actual, false, "put", 2); if (method == null) { throw createError(new IllegalArgumentException("method [" + typeToCanonicalTypeName(output.actual) + ", put/2] not found")); @@ -78,37 +78,33 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { throw createError(new IllegalStateException("Illegal tree structure.")); } - for (int index = 0; index < keys.size(); ++index) { - AExpression expression = keys.get(index); + List keyOutputs = new ArrayList<>(keys.size()); + List valueOutputs = new ArrayList<>(values.size()); + for (int i = 0; i < keys.size(); ++i) { + AExpression expression = keys.get(i); Input expressionInput = new Input(); expressionInput.expected = def.class; expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); - } - - for (int index = 0; index < values.size(); ++index) { - AExpression expression = values.get(index); + Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); + keyOutputs.add(expressionOutput); - Input expressionInput = new Input(); + expression = values.get(i); + expressionInput = new Input(); expressionInput.expected = def.class; expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); + valueOutputs.add(expressionOutput); } - return output; - } - - @Override - MapInitializationNode write(ClassNode classNode) { MapInitializationNode mapInitializationNode = new MapInitializationNode(); - for (int index = 0; index < keys.size(); ++index) { + for (int i = 0; i < keys.size(); 
++i) { mapInitializationNode.addArgumentNode( - keys.get(index).cast(keys.get(index).write(classNode)), - values.get(index).cast(values.get(index).write(classNode))); + keys.get(i).cast(keyOutputs.get(i)), + values.get(i).cast(valueOutputs.get(i))); } mapInitializationNode.setLocation(location); @@ -116,7 +112,9 @@ MapInitializationNode write(ClassNode classNode) { mapInitializationNode.setConstructor(constructor); mapInitializationNode.setMethod(method); - return mapInitializationNode; + output.expressionNode = mapInitializationNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArray.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArray.java index 73603d79e6c89..a4ce05d57cf98 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArray.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArray.java @@ -25,30 +25,31 @@ import org.elasticsearch.painless.ir.NewArrayNode; import org.elasticsearch.painless.symbol.ScriptRoot; +import java.util.ArrayList; +import java.util.Collections; import java.util.List; import java.util.Objects; /** * Represents an array instantiation. */ -public final class ENewArray extends AExpression { +public class ENewArray extends AExpression { - private final String type; - private final List arguments; - private final boolean initialize; + protected final String type; + protected final List arguments; + protected final boolean initialize; public ENewArray(Location location, String type, List arguments, boolean initialize) { super(location); this.type = Objects.requireNonNull(type); - this.arguments = Objects.requireNonNull(arguments); + this.arguments = Collections.unmodifiableList(Objects.requireNonNull(arguments)); this.initialize = initialize; } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("A newly created array must be read from.")); @@ -60,34 +61,32 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { throw createError(new IllegalArgumentException("Not a type [" + this.type + "].")); } - for (int argument = 0; argument < arguments.size(); ++argument) { - AExpression expression = arguments.get(argument); + List argumentOutputs = new ArrayList<>(); + for (AExpression expression : arguments) { Input expressionInput = new Input(); expressionInput.expected = initialize ? 
clazz.getComponentType() : int.class; expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); + argumentOutputs.add(expressionOutput); } output.actual = clazz; - return output; - } - - @Override - NewArrayNode write(ClassNode classNode) { NewArrayNode newArrayNode = new NewArrayNode(); - for (AExpression argument : arguments) { - newArrayNode.addArgumentNode(argument.cast(argument.write(classNode))); + for (int i = 0; i < arguments.size(); ++ i) { + newArrayNode.addArgumentNode(arguments.get(i).cast(argumentOutputs.get(i))); } newArrayNode.setLocation(location); newArrayNode.setExpressionType(output.actual); newArrayNode.setInitialize(initialize); - return newArrayNode; + output.expressionNode = newArrayNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArrayFunctionRef.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArrayFunctionRef.java index 0a132c12e5adb..16dc2ee729827 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArrayFunctionRef.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewArrayFunctionRef.java @@ -23,6 +23,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; +import org.elasticsearch.painless.ir.FunctionNode; import org.elasticsearch.painless.ir.NewArrayFuncRefNode; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -34,11 +35,11 @@ /** * Represents a function reference. */ -public final class ENewArrayFunctionRef extends AExpression implements ILambda { - private final String type; +public class ENewArrayFunctionRef extends AExpression implements ILambda { - private SFunction function; - private FunctionRef ref; + protected final String type; + + // TODO: #54015 private String defPointer; public ENewArrayFunctionRef(Location location, String type) { @@ -48,19 +49,20 @@ public ENewArrayFunctionRef(Location location, String type) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); SReturn code = new SReturn(location, new ENewArray(location, type, Arrays.asList(new EVariable(location, "size")), false)); - function = new SFunction( + SFunction function = new SFunction( location, type, scriptRoot.getNextSyntheticName("newarray"), Collections.singletonList("int"), Collections.singletonList("size"), new SBlock(location, Collections.singletonList(code)), true, true, true, false); function.generateSignature(scriptRoot.getPainlessLookup()); - function.analyze(scriptRoot); + FunctionNode functionNode = function.writeFunction(classNode, scriptRoot); scriptRoot.getFunctionTable().addFunction(function.name, function.returnType, function.typeParameters, true, true); + FunctionRef ref; + if (input.expected == null) { ref = null; output.actual = String.class; @@ -72,12 +74,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = input.expected; } - return output; - } - - @Override - NewArrayFuncRefNode write(ClassNode classNode) { - classNode.addFunctionNode(function.write(classNode)); + 
classNode.addFunctionNode(functionNode); NewArrayFuncRefNode newArrayFuncRefNode = new NewArrayFuncRefNode(); @@ -85,7 +82,9 @@ NewArrayFuncRefNode write(ClassNode classNode) { newArrayFuncRefNode.setExpressionType(output.actual); newArrayFuncRefNode.setFuncRef(ref); - return newArrayFuncRefNode; + output.expressionNode = newArrayFuncRefNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewObj.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewObj.java index dda96482a472d..1659910986173 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewObj.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENewObj.java @@ -28,6 +28,8 @@ import org.elasticsearch.painless.spi.annotation.NonDeterministicAnnotation; import org.elasticsearch.painless.symbol.ScriptRoot; +import java.util.ArrayList; +import java.util.Collections; import java.util.List; import java.util.Objects; @@ -36,24 +38,21 @@ /** * Represents and object instantiation. */ -public final class ENewObj extends AExpression { +public class ENewObj extends AExpression { - private final String type; - private final List arguments; - - private PainlessConstructor constructor; + protected final String type; + protected final List arguments; public ENewObj(Location location, String type, List arguments) { super(location); this.type = Objects.requireNonNull(type); - this.arguments = Objects.requireNonNull(arguments); + this.arguments = Collections.unmodifiableList(Objects.requireNonNull(arguments)); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); output.actual = scriptRoot.getPainlessLookup().canonicalTypeNameToType(this.type); @@ -61,7 +60,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { throw createError(new IllegalArgumentException("Not a type [" + this.type + "].")); } - constructor = scriptRoot.getPainlessLookup().lookupPainlessConstructor(output.actual, arguments.size()); + PainlessConstructor constructor = scriptRoot.getPainlessLookup().lookupPainlessConstructor(output.actual, arguments.size()); if (constructor == null) { throw createError(new IllegalArgumentException( @@ -79,27 +78,25 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { "expected [" + constructor.typeParameters.size() + "] arguments, but found [" + arguments.size() + "].")); } - for (int argument = 0; argument < arguments.size(); ++argument) { - AExpression expression = arguments.get(argument); + List argumentOutputs = new ArrayList<>(); + + for (int i = 0; i < arguments.size(); ++i) { + AExpression expression = arguments.get(i); Input expressionInput = new Input(); - expressionInput.expected = types[argument]; + expressionInput.expected = types[i]; expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); + argumentOutputs.add(expressionOutput); } output.statement = true; - return output; - } - - @Override - NewObjectNode write(ClassNode classNode) { NewObjectNode newObjectNode = new NewObjectNode(); - for (AExpression argument : arguments) { - 
newObjectNode.addArgumentNode(argument.cast(argument.write(classNode))); + for (int i = 0; i < arguments.size(); ++ i) { + newObjectNode.addArgumentNode(arguments.get(i).cast(argumentOutputs.get(i))); } newObjectNode.setLocation(location); @@ -107,7 +104,9 @@ NewObjectNode write(ClassNode classNode) { newObjectNode.setRead(input.read); newObjectNode.setConstructor(constructor); - return newObjectNode; + output.expressionNode = newObjectNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENull.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENull.java index 841bd09586bed..1b61a05a68501 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENull.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENull.java @@ -29,16 +29,15 @@ /** * Represents a null constant. */ -public final class ENull extends AExpression { +public class ENull extends AExpression { public ENull(Location location) { super(location); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("Must read from null constant.")); @@ -55,17 +54,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = Object.class; } - return output; - } - - @Override - NullNode write(ClassNode classNode) { NullNode nullNode = new NullNode(); nullNode.setLocation(location); nullNode.setExpressionType(output.actual); - return nullNode; + output.expressionNode = nullNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENumeric.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENumeric.java index f7cbe681b639d..1e548252f1a74 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENumeric.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ENumeric.java @@ -23,7 +23,6 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.ConstantNode; -import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.Objects; @@ -31,12 +30,10 @@ /** * Represents a non-decimal numeric constant. 
*/ -public final class ENumeric extends AExpression { +public class ENumeric extends AExpression { - private final String value; - private int radix; - - protected Object constant; + protected final String value; + protected final int radix; public ENumeric(Location location, String value, int radix) { super(location); @@ -46,9 +43,10 @@ public ENumeric(Location location, String value, int radix) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Object constant; + + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("Must read from constant [" + value + "].")); @@ -114,17 +112,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { } } - return output; - } - - @Override - ExpressionNode write(ClassNode classNode) { ConstantNode constantNode = new ConstantNode(); constantNode.setLocation(location); constantNode.setExpressionType(output.actual); constantNode.setConstant(constant); - return constantNode; + output.expressionNode = constantNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ERegex.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ERegex.java index 3847397407dd8..23e92f018c371 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ERegex.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/ERegex.java @@ -42,11 +42,10 @@ /** * Represents a regex constant. All regexes are constants. */ -public final class ERegex extends AExpression { +public class ERegex extends AExpression { - private final String pattern; - private final int flags; - private String name; + protected final String pattern; + protected final int flags; public ERegex(Location location, String pattern, String flagsString) { super(location); @@ -63,10 +62,8 @@ public ERegex(Location location, String pattern, String flagsString) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); - + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (scriptRoot.getCompilerSettings().areRegexesEnabled() == false) { throw createError(new IllegalStateException("Regexes are disabled. 
Set [script.painless.regex.enabled] to [true] " @@ -85,14 +82,9 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { new IllegalArgumentException("Error compiling regex: " + e.getDescription())); } - name = scriptRoot.getNextSyntheticName("regex"); + String name = scriptRoot.getNextSyntheticName("regex"); output.actual = Pattern.class; - return output; - } - - @Override - MemberFieldLoadNode write(ClassNode classNode) { FieldNode fieldNode = new FieldNode(); fieldNode.setLocation(location); fieldNode.setModifiers(Modifier.FINAL | Modifier.STATIC | Modifier.PRIVATE); @@ -169,7 +161,9 @@ MemberFieldLoadNode write(ClassNode classNode) { memberFieldLoadNode.setName(name); memberFieldLoadNode.setStatic(true); - return memberFieldLoadNode; + output.expressionNode = memberFieldLoadNode; + + return output; } private int flagForChar(char c) { diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EStatic.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EStatic.java index 0706e321e5312..b45ba741ae1a3 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EStatic.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EStatic.java @@ -30,9 +30,9 @@ /** * Represents a static type target. */ -public final class EStatic extends AExpression { +public class EStatic extends AExpression { - private final String type; + protected final String type; public EStatic(Location location, String type) { super(location); @@ -41,9 +41,8 @@ public EStatic(Location location, String type) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); output.actual = scriptRoot.getPainlessLookup().canonicalTypeNameToType(type); @@ -51,17 +50,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { throw createError(new IllegalArgumentException("Not a type [" + type + "].")); } - return output; - } - - @Override - StaticNode write(ClassNode classNode) { StaticNode staticNode = new StaticNode(); staticNode.setLocation(location); staticNode.setExpressionType(output.actual); - return staticNode; + output.expressionNode = staticNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EString.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EString.java index 5cc5b67fe511b..77b9fb857bc08 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EString.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EString.java @@ -23,7 +23,6 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.ConstantNode; -import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.Objects; @@ -31,7 +30,7 @@ /** * Represents a string constant. 
*/ -public final class EString extends AExpression { +public class EString extends AExpression { protected String constant; @@ -42,9 +41,8 @@ public EString(Location location, String string) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.read == false) { throw createError(new IllegalArgumentException("Must read from constant [" + constant + "].")); @@ -52,17 +50,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.actual = String.class; - return output; - } - - @Override - ExpressionNode write(ClassNode classNode) { ConstantNode constantNode = new ConstantNode(); constantNode.setLocation(location); constantNode.setExpressionType(output.actual); constantNode.setConstant(constant); - return constantNode; + output.expressionNode = constantNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EUnary.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EUnary.java index e498ceb3578c8..cafcda3b7dd2a 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EUnary.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EUnary.java @@ -25,7 +25,6 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.UnaryMathNode; -import org.elasticsearch.painless.ir.UnaryNode; import org.elasticsearch.painless.lookup.PainlessLookupUtility; import org.elasticsearch.painless.lookup.def; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -35,13 +34,10 @@ /** * Represents a unary math expression. 
*/ -public final class EUnary extends AExpression { +public class EUnary extends AExpression { - private final Operation operation; - private AExpression child; - - private Class promote; - private boolean originallyExplicit = false; // record whether there was originally an explicit cast + protected final Operation operation; + protected final AExpression child; public EUnary(Location location, Operation operation, AExpression child) { super(location); @@ -51,21 +47,24 @@ public EUnary(Location location, Operation operation, AExpression child) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); + + Class promote = null; + boolean originallyExplicit = input.explicit; - originallyExplicit = input.explicit; + Input childInput = new Input(); + Output childOutput; if (operation == Operation.NOT) { - Input childInput = new Input(); + childInput.expected = boolean.class; - child.analyze(scriptRoot, scope, childInput); - child.cast(); + childOutput = child.analyze(classNode, scriptRoot, scope, childInput); + child.cast(childInput, childOutput); output.actual = boolean.class; } else if (operation == Operation.BWNOT || operation == Operation.ADD || operation == Operation.SUB) { - Output childOutput = child.analyze(scriptRoot, scope, new Input()); + childOutput = child.analyze(classNode, scriptRoot, scope, new Input()); promote = AnalyzerCaster.promoteNumeric(childOutput.actual, operation != Operation.BWNOT); @@ -75,8 +74,8 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { "[" + PainlessLookupUtility.typeToCanonicalTypeName(childOutput.actual) + "]")); } - child.input.expected = promote; - child.cast(); + childInput.expected = promote; + child.cast(childInput, childOutput); if (promote == def.class && input.expected != null) { output.actual = input.expected; @@ -87,14 +86,9 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { throw createError(new IllegalStateException("unexpected unary operation [" + operation.name + "]")); } - return output; - } - - @Override - UnaryNode write(ClassNode classNode) { UnaryMathNode unaryMathNode = new UnaryMathNode(); - unaryMathNode.setChildNode(child.cast(child.write(classNode))); + unaryMathNode.setChildNode(child.cast(childOutput)); unaryMathNode.setLocation(location); unaryMathNode.setExpressionType(output.actual); @@ -102,7 +96,9 @@ UnaryNode write(ClassNode classNode) { unaryMathNode.setOperation(operation); unaryMathNode.setOriginallExplicit(originallyExplicit); - return unaryMathNode; + output.expressionNode = unaryMathNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EVariable.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EVariable.java index 2f1a4d6f09a72..0c345dfa46735 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EVariable.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/EVariable.java @@ -31,9 +31,9 @@ /** * Represents a variable load/store. 
*/ -public final class EVariable extends AStoreable { +public class EVariable extends AStoreable { - private final String name; + protected final String name; public EVariable(Location location, String name) { super(location); @@ -42,20 +42,19 @@ public EVariable(Location location, String name) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AExpression.Input input) { + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AExpression.Input input) { AStoreable.Input storeableInput = new AStoreable.Input(); storeableInput.read = input.read; storeableInput.expected = input.expected; storeableInput.explicit = input.explicit; storeableInput.internal = input.internal; - return analyze(scriptRoot, scope, storeableInput); + return analyze(classNode, scriptRoot, scope, storeableInput); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); Variable variable = scope.getVariable(location, name); @@ -65,18 +64,15 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { output.actual = variable.getType(); - return output; - } - - @Override - VariableNode write(ClassNode classNode) { VariableNode variableNode = new VariableNode(); variableNode.setLocation(location); variableNode.setExpressionType(output.actual); variableNode.setName(name); - return variableNode; + output.expressionNode = variableNode; + + return output; } @Override @@ -84,11 +80,6 @@ boolean isDefOptimized() { return false; } - @Override - void updateActual(Class actual) { - throw new IllegalArgumentException("Illegal tree structure."); - } - @Override public String toString() { return singleLineToString(name); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PBrace.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PBrace.java index 684b015aa835e..664a9e1c0048a 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PBrace.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PBrace.java @@ -34,11 +34,12 @@ /** * Represents an array load/store and defers to a child subnode. 
*/ -public final class PBrace extends AStoreable { +public class PBrace extends AStoreable { - private AExpression index; + protected final AExpression index; - private AStoreable sub = null; + // TODO: #54015 + private boolean isDefOptimized = false; public PBrace(Location location, AExpression prefix, AExpression index) { super(location, prefix); @@ -47,24 +48,26 @@ public PBrace(Location location, AExpression prefix, AExpression index) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AExpression.Input input) { + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AExpression.Input input) { AStoreable.Input storeableInput = new AStoreable.Input(); storeableInput.read = input.read; storeableInput.expected = input.expected; storeableInput.explicit = input.explicit; storeableInput.internal = input.internal; - return analyze(scriptRoot, scope, storeableInput); + return analyze(classNode, scriptRoot, scope, storeableInput); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); - Output prefixOutput = prefix.analyze(scriptRoot, scope, new Input()); - prefix.input.expected = prefixOutput.actual; - prefix.cast(); + Input prefixInput = new Input(); + Output prefixOutput = prefix.analyze(classNode, scriptRoot, scope, prefixInput); + prefixInput.expected = prefixOutput.actual; + prefix.cast(prefixInput, prefixOutput); + + AStoreable sub; if (prefixOutput.actual.isArray()) { sub = new PSubBrace(location, prefixOutput.actual, index); @@ -79,39 +82,32 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { "[" + PainlessLookupUtility.typeToCanonicalTypeName(prefixOutput.actual) + "].")); } + isDefOptimized = sub.isDefOptimized(); + Input subInput = new Input(); subInput.write = input.write; subInput.read = input.read; subInput.expected = input.expected; subInput.explicit = input.explicit; - Output subOutput = sub.analyze(scriptRoot, scope, subInput); + Output subOutput = sub.analyze(classNode, scriptRoot, scope, subInput); output.actual = subOutput.actual; - return output; - } - - @Override - BraceNode write(ClassNode classNode) { BraceNode braceNode = new BraceNode(); - braceNode.setLeftNode(prefix.cast(prefix.write(classNode))); - braceNode.setRightNode(sub.write(classNode)); + braceNode.setLeftNode(prefix.cast(prefixOutput)); + braceNode.setRightNode(subOutput.expressionNode); braceNode.setLocation(location); braceNode.setExpressionType(output.actual); - return braceNode; - } + output.expressionNode = braceNode; - @Override - boolean isDefOptimized() { - return sub.isDefOptimized(); + return output; } @Override - void updateActual(Class actual) { - sub.updateActual(actual); - this.output.actual = actual; + boolean isDefOptimized() { + return isDefOptimized; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PCallInvoke.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PCallInvoke.java index 4a0254ef96253..036c04f3878b5 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PCallInvoke.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PCallInvoke.java @@ -28,6 +28,7 @@ import org.elasticsearch.painless.spi.annotation.NonDeterministicAnnotation; import org.elasticsearch.painless.symbol.ScriptRoot; +import 
java.util.Collections; import java.util.List; import java.util.Objects; @@ -36,30 +37,30 @@ /** * Represents a method call and defers to a child subnode. */ -public final class PCallInvoke extends AExpression { +public class PCallInvoke extends AExpression { - private final String name; - private final boolean nullSafe; - private final List arguments; - - private AExpression sub = null; + protected final String name; + protected final boolean nullSafe; + protected final List arguments; public PCallInvoke(Location location, AExpression prefix, String name, boolean nullSafe, List arguments) { super(location, prefix); this.name = Objects.requireNonNull(name); this.nullSafe = nullSafe; - this.arguments = Objects.requireNonNull(arguments); + this.arguments = Collections.unmodifiableList(Objects.requireNonNull(arguments)); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); + + Input prefixInput = new Input(); + Output prefixOutput = prefix.analyze(classNode, scriptRoot, scope, prefixInput); + prefixInput.expected = prefixOutput.actual; + prefix.cast(prefixInput, prefixOutput); - Output prefixOutput = prefix.analyze(scriptRoot, scope, new Input()); - prefix.input.expected = prefixOutput.actual; - prefix.cast(); + AExpression sub; if (prefixOutput.actual == def.class) { sub = new PSubDefCall(location, name, arguments); @@ -84,25 +85,22 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { Input subInput = new Input(); subInput.expected = input.expected; subInput.explicit = input.explicit; - Output subOutput = sub.analyze(scriptRoot, scope, subInput); + Output subOutput = sub.analyze(classNode, scriptRoot, scope, subInput); output.actual = subOutput.actual; output.statement = true; - return output; - } - - @Override - CallNode write(ClassNode classNode) { CallNode callNode = new CallNode(); - callNode.setLeftNode(prefix.cast(prefix.write(classNode))); - callNode.setRightNode(sub.write(classNode)); + callNode.setLeftNode(prefix.cast(prefixOutput)); + callNode.setRightNode(subOutput.expressionNode); callNode.setLocation(location); callNode.setExpressionType(output.actual); - return callNode; + output.expressionNode = callNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PField.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PField.java index 2380aa536edde..9751786e1515b 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PField.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PField.java @@ -38,12 +38,13 @@ /** * Represents a field load/store and defers to a child subnode. 
*/ -public final class PField extends AStoreable { +public class PField extends AStoreable { - private final boolean nullSafe; - private final String value; + protected final boolean nullSafe; + protected final String value; - private AStoreable sub = null; + // TODO: #54015 + private boolean isDefOptimized; public PField(Location location, AExpression prefix, boolean nullSafe, String value) { super(location, prefix); @@ -53,24 +54,26 @@ public PField(Location location, AExpression prefix, boolean nullSafe, String va } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AExpression.Input input) { + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AExpression.Input input) { AStoreable.Input storeableInput = new AStoreable.Input(); storeableInput.read = input.read; storeableInput.expected = input.expected; storeableInput.explicit = input.explicit; storeableInput.internal = input.internal; - return analyze(scriptRoot, scope, storeableInput); + return analyze(classNode, scriptRoot, scope, storeableInput); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); - Output prefixOutput = prefix.analyze(scriptRoot, scope, new Input()); - prefix.input.expected = prefixOutput.actual; - prefix.cast(); + Input prefixInput = new Input(); + Output prefixOutput = prefix.analyze(classNode, scriptRoot, scope, prefixInput); + prefixInput.expected = prefixOutput.actual; + prefix.cast(prefixInput, prefixOutput); + + AStoreable sub = null; if (prefixOutput.actual.isArray()) { sub = new PSubArrayLength(location, PainlessLookupUtility.typeToCanonicalTypeName(prefixOutput.actual), value); @@ -99,7 +102,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { location, value, PainlessLookupUtility.typeToCanonicalTypeName(prefixOutput.actual), getter, setter); } else { EConstant index = new EConstant(location, value); - index.analyze(scriptRoot, scope, new Input()); + index.analyze(classNode, scriptRoot, scope, new Input()); if (Map.class.isAssignableFrom(prefixOutput.actual)) { sub = new PSubMapShortcut(location, prefixOutput.actual, index); @@ -119,6 +122,8 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { } } + isDefOptimized = sub.isDefOptimized(); + if (nullSafe) { sub = new PSubNullSafeField(location, sub); } @@ -128,34 +133,25 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { subInput.read = input.read; subInput.expected = input.expected; subInput.explicit = input.explicit; - Output subOutput = sub.analyze(scriptRoot, scope, subInput); + Output subOutput = sub.analyze(classNode, scriptRoot, scope, subInput); output.actual = subOutput.actual; - return output; - } - - @Override - DotNode write(ClassNode classNode) { DotNode dotNode = new DotNode(); - dotNode.setLeftNode(prefix.cast(prefix.write(classNode))); - dotNode.setRightNode(sub.write(classNode)); + dotNode.setLeftNode(prefix.cast(prefixOutput)); + dotNode.setRightNode(subOutput.expressionNode); dotNode.setLocation(location); dotNode.setExpressionType(output.actual); - return dotNode; - } + output.expressionNode = dotNode; - @Override - boolean isDefOptimized() { - return sub.isDefOptimized(); + return output; } @Override - void updateActual(Class actual) { - sub.updateActual(actual); - this.output.actual = actual; + boolean 
isDefOptimized() { + return isDefOptimized; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubArrayLength.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubArrayLength.java index 43cdba046bd54..aaefa0717d3ba 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubArrayLength.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubArrayLength.java @@ -30,10 +30,10 @@ /** * Represents an array length field load. */ -final class PSubArrayLength extends AStoreable { +public class PSubArrayLength extends AStoreable { - private final String type; - private final String value; + protected final String type; + protected final String value; PSubArrayLength(Location location, String type, String value) { super(location); @@ -43,9 +43,8 @@ final class PSubArrayLength extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); if ("length".equals(value)) { if (input.write) { @@ -57,27 +56,19 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { throw createError(new IllegalArgumentException("Field [" + value + "] does not exist for type [" + type + "].")); } - return output; - } - - @Override - DotSubArrayLengthNode write(ClassNode classNode) { DotSubArrayLengthNode dotSubArrayLengthNode = new DotSubArrayLengthNode(); dotSubArrayLengthNode.setLocation(location); dotSubArrayLengthNode.setExpressionType(output.actual); - return dotSubArrayLengthNode; - } + output.expressionNode = dotSubArrayLengthNode; - @Override - boolean isDefOptimized() { - throw new IllegalStateException("Illegal tree structure."); + return output; } @Override - void updateActual(Class actual) { - throw new IllegalStateException("Illegal tree structure."); + boolean isDefOptimized() { + return false; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubBrace.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubBrace.java index 0303ce051f7f5..2a285d49acbe2 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubBrace.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubBrace.java @@ -30,10 +30,10 @@ /** * Represents an array load/store. 
*/ -final class PSubBrace extends AStoreable { +public class PSubBrace extends AStoreable { - private final Class<?> clazz; - private AExpression index; + protected final Class<?> clazz; + protected final AExpression index; PSubBrace(Location location, Class<?> clazz, AExpression index) { super(location); @@ -43,29 +43,26 @@ final class PSubBrace extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); Input indexInput = new Input(); indexInput.expected = int.class; - index.analyze(scriptRoot, scope, indexInput); - index.cast(); + Output indexOutput = index.analyze(classNode, scriptRoot, scope, indexInput); + index.cast(indexInput, indexOutput); output.actual = clazz.getComponentType(); - return output; - } - - BraceSubNode write(ClassNode classNode) { BraceSubNode braceSubNode = new BraceSubNode(); - braceSubNode.setChildNode(index.cast(index.write(classNode))); + braceSubNode.setChildNode(index.cast(indexOutput)); braceSubNode.setLocation(location); braceSubNode.setExpressionType(output.actual); - return braceSubNode; + output.expressionNode = braceSubNode; + + return output; } @Override @@ -73,11 +70,6 @@ boolean isDefOptimized() { return false; } - @Override - void updateActual(Class<?> actual) { - throw createError(new IllegalStateException("Illegal tree structure.")); - } - @Override public String toString() { return singleLineToString(prefix, index); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubCallInvoke.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubCallInvoke.java index 7e509682f7d47..721e76cd3e500 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubCallInvoke.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubCallInvoke.java @@ -26,30 +26,33 @@ import org.elasticsearch.painless.lookup.PainlessMethod; import org.elasticsearch.painless.symbol.ScriptRoot; +import java.util.ArrayList; +import java.util.Collections; import java.util.List; import java.util.Objects; /** * Represents a method call. 
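 *
 * (With casts now driven by explicit Input/Output pairs rather than state
 * cached on the child node, call nodes keep one Output per argument and replay
 * them while wiring the IR; condensed from the PSubCallInvoke hunk below:
 *
 *   List<Output> argumentOutputs = new ArrayList<>();
 *   for (int argument = 0; argument < arguments.size(); ++argument) {
 *       Input expressionInput = new Input();
 *       expressionInput.expected = method.typeParameters.get(argument);
 *       expressionInput.internal = true;
 *       Output expressionOutput = arguments.get(argument).analyze(classNode, scriptRoot, scope, expressionInput);
 *       arguments.get(argument).cast(expressionInput, expressionOutput);
 *       argumentOutputs.add(expressionOutput);
 *   }
 *   // later in the same pass:
 *   // callSubNode.addArgumentNode(arguments.get(argument).cast(argumentOutputs.get(argument)));
 * )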
*/ -final class PSubCallInvoke extends AExpression { +public class PSubCallInvoke extends AExpression { - private final PainlessMethod method; - private final Class<?> box; - private final List<AExpression> arguments; + protected final PainlessMethod method; + protected final Class<?> box; + protected final List<AExpression> arguments; PSubCallInvoke(Location location, PainlessMethod method, Class<?> box, List<AExpression> arguments) { super(location); this.method = Objects.requireNonNull(method); this.box = box; - this.arguments = Objects.requireNonNull(arguments); + this.arguments = Collections.unmodifiableList(Objects.requireNonNull(arguments)); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); + + List<Output> argumentOutputs = new ArrayList<>(); for (int argument = 0; argument < arguments.size(); ++argument) { AExpression expression = arguments.get(argument); @@ -57,22 +60,18 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { Input expressionInput = new Input(); expressionInput.expected = method.typeParameters.get(argument); expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); + argumentOutputs.add(expressionOutput); } output.statement = true; output.actual = method.returnType; - return output; - } - - @Override - CallSubNode write(ClassNode classNode) { CallSubNode callSubNode = new CallSubNode(); - for (AExpression argument : arguments) { - callSubNode.addArgumentNode(argument.cast(argument.write(classNode))); + for (int argument = 0; argument < arguments.size(); ++argument) { + callSubNode.addArgumentNode(arguments.get(argument).cast(argumentOutputs.get(argument))); } callSubNode.setLocation(location); @@ -80,7 +79,9 @@ CallSubNode write(ClassNode classNode) { callSubNode.setMethod(method); callSubNode.setBox(box); - return callSubNode; + output.expressionNode = callSubNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefArray.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefArray.java index 060102ae8593f..00f09a493c0e9 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefArray.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefArray.java @@ -32,8 +32,9 @@ /** * Represents an array load/store or shortcut on a def type. (Internal only.) 
*/ -final class PSubDefArray extends AStoreable { - private AExpression index; +public class PSubDefArray extends AStoreable { + + protected AExpression index; PSubDefArray(Location location, AExpression index) { super(location); @@ -42,30 +43,27 @@ final class PSubDefArray extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); - Output indexOutput = index.analyze(scriptRoot, scope, new Input()); - index.input.expected = indexOutput.actual; - index.cast(); + Input indexInput = new Input(); + Output indexOutput = index.analyze(classNode, scriptRoot, scope, indexInput); + indexInput.expected = indexOutput.actual; + index.cast(indexInput, indexOutput); // TODO: remove ZonedDateTime exception when JodaCompatibleDateTime is removed output.actual = input.expected == null || input.expected == ZonedDateTime.class || input.explicit ? def.class : input.expected; - return output; - } - - @Override - BraceSubDefNode write(ClassNode classNode) { BraceSubDefNode braceSubDefNode = new BraceSubDefNode(); - braceSubDefNode.setChildNode(index.cast(index.write(classNode))); + braceSubDefNode.setChildNode(index.cast(indexOutput)); braceSubDefNode.setLocation(location); braceSubDefNode.setExpressionType(output.actual); - return braceSubDefNode; + output.expressionNode = braceSubDefNode; + + return output; } @Override @@ -73,11 +71,6 @@ boolean isDefOptimized() { return true; } - @Override - void updateActual(Class actual) { - this.output.actual = actual; - } - @Override public String toString() { return singleLineToString(prefix, index); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefCall.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefCall.java index f810f1a634b6e..8339d161f2a84 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefCall.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefCall.java @@ -28,51 +28,55 @@ import java.time.ZonedDateTime; import java.util.ArrayList; +import java.util.Collections; import java.util.List; import java.util.Objects; /** * Represents a method call made on a def type. (Internal only.) 
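 *
 * (The def-typed nodes below all compute their result type with the same rule:
 * adopt the caller's expected type unless the cast was written explicitly, so
 * that the dynamic call site needs no extra runtime cast. As it appears in
 * PSubDefArray, PSubDefCall, and PSubDefField:
 *
 *   // per the in-source TODO, the ZonedDateTime carve-out goes away with JodaCompatibleDateTime
 *   output.actual = input.expected == null || input.expected == ZonedDateTime.class || input.explicit
 *           ? def.class
 *           : input.expected;
 * )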
*/ -final class PSubDefCall extends AExpression { +public class PSubDefCall extends AExpression { - private final String name; - private final List<AExpression> arguments; - - private final StringBuilder recipe = new StringBuilder(); - private final List<String> pointers = new ArrayList<>(); - private final List<Class<?>> parameterTypes = new ArrayList<>(); + protected final String name; + protected final List<AExpression> arguments; PSubDefCall(Location location, String name, List<AExpression> arguments) { super(location); this.name = Objects.requireNonNull(name); - this.arguments = Objects.requireNonNull(arguments); + this.arguments = Collections.unmodifiableList(Objects.requireNonNull(arguments)); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); + + StringBuilder recipe = new StringBuilder(); + List<String> pointers = new ArrayList<>(); + List<Class<?>> parameterTypes = new ArrayList<>(); parameterTypes.add(Object.class); int totalCaptures = 0; + List<Output> argumentOutputs = new ArrayList<>(arguments.size()); + for (int argument = 0; argument < arguments.size(); ++argument) { AExpression expression = arguments.get(argument); Input expressionInput = new Input(); expressionInput.internal = true; - Output expressionOutput = expression.analyze(scriptRoot, scope, expressionInput); + Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + argumentOutputs.add(expressionOutput); if (expressionOutput.actual == void.class) { throw createError(new IllegalArgumentException("Argument(s) cannot be of [void] type when calling method [" + name + "].")); } - expression.input.expected = expressionOutput.actual; - expression.cast(); + expressionInput.expected = expressionOutput.actual; + expression.cast(expressionInput, expressionOutput); parameterTypes.add(expressionOutput.actual); + // TODO: #54015 if (expression instanceof ILambda) { ILambda lambda = (ILambda) expression; pointers.add(lambda.getPointer()); @@ -87,15 +91,10 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { // TODO: remove ZonedDateTime exception when JodaCompatibleDateTime is removed output.actual = input.expected == null || input.expected == ZonedDateTime.class || input.explicit ? 
def.class : input.expected; - return output; - } - - @Override - CallSubDefNode write(ClassNode classNode) { CallSubDefNode callSubDefNode = new CallSubDefNode(); - for (AExpression argument : arguments) { - callSubDefNode.addArgumentNode(argument.cast(argument.write(classNode))); + for (int argument = 0; argument < arguments.size(); ++argument) { + callSubDefNode.addArgumentNode(arguments.get(argument).cast(argumentOutputs.get(argument))); } callSubDefNode.setLocation(location); @@ -105,7 +104,9 @@ CallSubDefNode write(ClassNode classNode) { callSubDefNode.getPointers().addAll(pointers); callSubDefNode.getTypeParameters().addAll(parameterTypes); - return callSubDefNode; + output.expressionNode = callSubDefNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefField.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefField.java index 75bfa5c5755cc..1cb8b2a0119e3 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefField.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubDefField.java @@ -32,9 +32,9 @@ /** * Represents a field load/store or shortcut on a def type. (Internal only.) */ -final class PSubDefField extends AStoreable { +public class PSubDefField extends AStoreable { - private final String value; + protected final String value; PSubDefField(Location location, String value) { super(location); @@ -43,25 +43,21 @@ final class PSubDefField extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); // TODO: remove ZonedDateTime exception when JodaCompatibleDateTime is removed output.actual = input.expected == null || input.expected == ZonedDateTime.class || input.explicit ? def.class : input.expected; - return output; - } - - @Override - DotSubDefNode write(ClassNode classNode) { DotSubDefNode dotSubDefNode = new DotSubDefNode(); dotSubDefNode.setLocation(location); dotSubDefNode.setExpressionType(output.actual); dotSubDefNode.setValue(value); - return dotSubDefNode; + output.expressionNode = dotSubDefNode; + + return output; } @Override @@ -69,11 +65,6 @@ boolean isDefOptimized() { return true; } - @Override - void updateActual(Class<?> actual) { - this.output.actual = actual; - } - @Override public String toString() { return singleLineToString(prefix, value); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubField.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubField.java index c9a438571bc36..26d2858f54afb 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubField.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubField.java @@ -33,9 +33,9 @@ /** * Represents a field load/store. 
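 *
 * (PSubField rejects writes to final fields during analysis, before any IR is
 * built; the check is plain java.lang.reflect.Modifier, condensed here:
 *
 *   if (input.write && Modifier.isFinal(field.javaField.getModifiers())) {
 *       throw createError(new IllegalArgumentException(
 *               "Cannot write to read-only field [" + field.javaField.getName() + "]"));
 *   }
 * )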
*/ -final class PSubField extends AStoreable { +public class PSubField extends AStoreable { - private final PainlessField field; + protected final PainlessField field; PSubField(Location location, PainlessField field) { super(location); @@ -44,9 +44,8 @@ final class PSubField extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); if (input.write && Modifier.isFinal(field.javaField.getModifiers())) { throw createError(new IllegalArgumentException("Cannot write to read-only field [" + field.javaField.getName() + "] " + @@ -55,18 +54,15 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { output.actual = field.typeParameter; - return output; - } - - @Override - DotSubNode write(ClassNode classNode) { DotSubNode dotSubNode = new DotSubNode(); dotSubNode.setLocation(location); dotSubNode.setExpressionType(output.actual); dotSubNode.setField(field); - return dotSubNode; + output.expressionNode = dotSubNode; + + return output; } @Override @@ -74,11 +70,6 @@ boolean isDefOptimized() { return false; } - @Override - void updateActual(Class actual) { - throw new IllegalArgumentException("Illegal tree structure."); - } - @Override public String toString() { return singleLineToString(prefix, field.javaField.getName()); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubListShortcut.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubListShortcut.java index 4dd3b46ceaa10..d23039453b3f8 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubListShortcut.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubListShortcut.java @@ -32,13 +32,10 @@ /** * Represents a list load/store shortcut. (Internal only.) 
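 *
 * (The list and map shortcuts resolve their accessors through the painless
 * lookup and validate the shapes before committing to an IR node; condensed
 * from the hunks below:
 *
 *   PainlessMethod getter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "get", 1);
 *   PainlessMethod setter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "set", 2);
 *   // the getter must be int -> T and the setter (int, T) -> anything, with
 *   // matching T; otherwise analysis fails with "Illegal list shortcut ...".
 * )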
*/ -final class PSubListShortcut extends AStoreable { +public class PSubListShortcut extends AStoreable { - private final Class targetClass; - private AExpression index; - - private PainlessMethod getter; - private PainlessMethod setter; + protected final Class targetClass; + protected final AExpression index; PSubListShortcut(Location location, Class targetClass, AExpression index) { super(location); @@ -48,14 +45,13 @@ final class PSubListShortcut extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); String canonicalClassName = PainlessLookupUtility.typeToCanonicalTypeName(targetClass); - getter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "get", 1); - setter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "set", 2); + PainlessMethod getter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "get", 1); + PainlessMethod setter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "set", 2); if (getter != null && (getter.returnType == void.class || getter.typeParameters.size() != 1 || getter.typeParameters.get(0) != int.class)) { @@ -71,32 +67,31 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { throw createError(new IllegalArgumentException("Shortcut argument types must match.")); } + Output indexOutput = new Output(); + if ((input.read || input.write) && (input.read == false || getter != null) && (input.write == false || setter != null)) { Input indexInput = new Input(); indexInput.expected = int.class; - index.analyze(scriptRoot, scope, indexInput); - index.cast(); + indexOutput = index.analyze(classNode, scriptRoot, scope, indexInput); + index.cast(indexInput, indexOutput); output.actual = setter != null ? setter.typeParameters.get(1) : getter.returnType; } else { throw createError(new IllegalArgumentException("Illegal list shortcut for type [" + canonicalClassName + "].")); } - return output; - } - - @Override - ListSubShortcutNode write(ClassNode classNode) { ListSubShortcutNode listSubShortcutNode = new ListSubShortcutNode(); - listSubShortcutNode.setChildNode(index.cast(index.write(classNode))); + listSubShortcutNode.setChildNode(index.cast(indexOutput)); listSubShortcutNode.setLocation(location); listSubShortcutNode.setExpressionType(output.actual); listSubShortcutNode.setGetter(getter); listSubShortcutNode.setSetter(setter); - return listSubShortcutNode; + output.expressionNode = listSubShortcutNode; + + return output; } @Override @@ -104,11 +99,6 @@ boolean isDefOptimized() { return false; } - @Override - void updateActual(Class actual) { - throw new IllegalArgumentException("Illegal tree structure."); - } - @Override public String toString() { return singleLineToString(prefix, index); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubMapShortcut.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubMapShortcut.java index 707609e5a4947..4105bcec8b261 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubMapShortcut.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubMapShortcut.java @@ -32,13 +32,10 @@ /** * Represents a map load/store shortcut. (Internal only.) 
*/ -final class PSubMapShortcut extends AStoreable { +public class PSubMapShortcut extends AStoreable { - private final Class targetClass; - private AExpression index; - - private PainlessMethod getter; - private PainlessMethod setter; + protected final Class targetClass; + protected final AExpression index; PSubMapShortcut(Location location, Class targetClass, AExpression index) { super(location); @@ -48,14 +45,13 @@ final class PSubMapShortcut extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); String canonicalClassName = PainlessLookupUtility.typeToCanonicalTypeName(targetClass); - getter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "get", 1); - setter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "put", 2); + PainlessMethod getter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "get", 1); + PainlessMethod setter = scriptRoot.getPainlessLookup().lookupPainlessMethod(targetClass, false, "put", 2); if (getter != null && (getter.returnType == void.class || getter.typeParameters.size() != 1)) { throw createError(new IllegalArgumentException("Illegal map get shortcut for type [" + canonicalClassName + "].")); @@ -70,32 +66,31 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { throw createError(new IllegalArgumentException("Shortcut argument types must match.")); } + Output indexOutput; + if ((input.read || input.write) && (input.read == false || getter != null) && (input.write == false || setter != null)) { Input indexInput = new Input(); indexInput.expected = setter != null ? setter.typeParameters.get(0) : getter.typeParameters.get(0); - index.analyze(scriptRoot, scope, indexInput); - index.cast(); + indexOutput = index.analyze(classNode, scriptRoot, scope, indexInput); + index.cast(indexInput, indexOutput); output.actual = setter != null ? 
setter.typeParameters.get(1) : getter.returnType; } else { throw createError(new IllegalArgumentException("Illegal map shortcut for type [" + canonicalClassName + "].")); } - return output; - } - - @Override - MapSubShortcutNode write(ClassNode classNode) { MapSubShortcutNode mapSubShortcutNode = new MapSubShortcutNode(); - mapSubShortcutNode.setChildNode(index.cast(index.write(classNode))); + mapSubShortcutNode.setChildNode(index.cast(indexOutput)); mapSubShortcutNode.setLocation(location); mapSubShortcutNode.setExpressionType(output.actual); mapSubShortcutNode.setGetter(getter); mapSubShortcutNode.setSetter(setter); - return mapSubShortcutNode; + output.expressionNode = mapSubShortcutNode; + + return output; } @Override @@ -103,11 +98,6 @@ boolean isDefOptimized() { return false; } - @Override - void updateActual(Class<?> actual) { - throw new IllegalArgumentException("Illegal tree structure."); - } - @Override public String toString() { return singleLineToString(prefix, index); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeCallInvoke.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeCallInvoke.java index 83d5fded5649d..bc9a259a2e96c 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeCallInvoke.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeCallInvoke.java @@ -31,10 +31,11 @@ * Implements a call whose value is null if the prefix is null rather than throwing an NPE. */ public class PSubNullSafeCallInvoke extends AExpression { + /** - * The expression gaurded by the null check. Required at construction time and replaced at analysis time. + * The expression guarded by the null check. Required at construction time and replaced at analysis time. 
*/ - private AExpression guarded; + protected final AExpression guarded; public PSubNullSafeCallInvoke(Location location, AExpression guarded) { super(location); @@ -42,29 +43,25 @@ public PSubNullSafeCallInvoke(Location location, AExpression guarded) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); - Output guardedOutput = guarded.analyze(scriptRoot, scope, new Input()); + Output guardedOutput = guarded.analyze(classNode, scriptRoot, scope, new Input()); output.actual = guardedOutput.actual; if (output.actual.isPrimitive()) { throw new IllegalArgumentException("Result of null safe operator must be nullable"); } - return output; - } - - @Override - NullSafeSubNode write(ClassNode classNode) { NullSafeSubNode nullSafeSubNode = new NullSafeSubNode(); - nullSafeSubNode.setChildNode(guarded.write(classNode)); + nullSafeSubNode.setChildNode(guardedOutput.expressionNode); nullSafeSubNode.setLocation(location); nullSafeSubNode.setExpressionType(output.actual); - return nullSafeSubNode; + output.expressionNode = nullSafeSubNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeField.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeField.java index 536c8b15e83c6..22d4c91945273 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeField.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubNullSafeField.java @@ -29,7 +29,8 @@ * Implements a field whose value is null if the prefix is null rather than throwing an NPE. 
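 *
 * (Both null-safe nodes require the guarded expression's type to be able to
 * hold null, since ?. evaluates to null when the prefix is null; primitives
 * are therefore rejected during analysis:
 *
 *   Output guardedOutput = guarded.analyze(classNode, scriptRoot, scope, guardedInput);
 *   output.actual = guardedOutput.actual;
 *   if (output.actual.isPrimitive()) {
 *       throw new IllegalArgumentException("Result of null safe operator must be nullable");
 *   }
 *
 * Note also the behavioral tweak below: isDefOptimized() now returns false
 * outright instead of delegating to the guarded node.)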
*/ public class PSubNullSafeField extends AStoreable { - private AStoreable guarded; + + protected final AStoreable guarded; public PSubNullSafeField(Location location, AStoreable guarded) { super(location); @@ -37,44 +38,37 @@ public PSubNullSafeField(Location location, AStoreable guarded) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); if (input.write) { throw createError(new IllegalArgumentException("Can't write to null safe reference")); } + Input guardedInput = new Input(); guardedInput.read = input.read; - Output guardedOutput = guarded.analyze(scriptRoot, scope, guardedInput); + Output guardedOutput = guarded.analyze(classNode, scriptRoot, scope, guardedInput); output.actual = guardedOutput.actual; + if (output.actual.isPrimitive()) { throw new IllegalArgumentException("Result of null safe operator must be nullable"); } - return output; - } - - @Override - boolean isDefOptimized() { - return guarded.isDefOptimized(); - } - - @Override - void updateActual(Class actual) { - guarded.updateActual(actual); - } - - @Override - NullSafeSubNode write(ClassNode classNode) { NullSafeSubNode nullSafeSubNode = new NullSafeSubNode(); - nullSafeSubNode.setChildNode(guarded.write(classNode)); + nullSafeSubNode.setChildNode(guardedOutput.expressionNode); nullSafeSubNode.setLocation(location); nullSafeSubNode.setExpressionType(output.actual); - return nullSafeSubNode; + output.expressionNode = nullSafeSubNode; + + return output; + } + + @Override + boolean isDefOptimized() { + return false; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubShortcut.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubShortcut.java index e88fbeaac0765..a55a5556bafca 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubShortcut.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/PSubShortcut.java @@ -29,12 +29,12 @@ /** * Represents a field load/store shortcut. (Internal only.) 
*/ -final class PSubShortcut extends AStoreable { +public class PSubShortcut extends AStoreable { - private final String value; - private final String type; - private final PainlessMethod getter; - private final PainlessMethod setter; + protected final String value; + protected final String type; + protected final PainlessMethod getter; + protected final PainlessMethod setter; PSubShortcut(Location location, String value, String type, PainlessMethod getter, PainlessMethod setter) { super(location); @@ -46,9 +46,8 @@ final class PSubShortcut extends AStoreable { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { + Output output = new Output(); if (getter != null && (getter.returnType == void.class || !getter.typeParameters.isEmpty())) { throw createError(new IllegalArgumentException( @@ -70,11 +69,6 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, AStoreable.Input input) { throw createError(new IllegalArgumentException("Illegal shortcut on field [" + value + "] for type [" + type + "].")); } - return output; - } - - @Override - DotSubShortcutNode write(ClassNode classNode) { DotSubShortcutNode dotSubShortcutNode = new DotSubShortcutNode(); dotSubShortcutNode.setLocation(location); @@ -82,7 +76,9 @@ DotSubShortcutNode write(ClassNode classNode) { dotSubShortcutNode.setGetter(getter); dotSubShortcutNode.setSetter(setter); - return dotSubShortcutNode; + output.expressionNode = dotSubShortcutNode; + + return output; } @Override @@ -90,11 +86,6 @@ boolean isDefOptimized() { return false; } - @Override - void updateActual(Class actual) { - throw new IllegalArgumentException("Illegal tree structure."); - } - @Override public String toString() { return singleLineToString(prefix, value); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBlock.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBlock.java index 8e6db345e0b30..f662976c2e4b3 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBlock.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBlock.java @@ -25,6 +25,7 @@ import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.symbol.ScriptRoot; +import java.util.ArrayList; import java.util.Collections; import java.util.List; @@ -33,9 +34,9 @@ /** * Represents a set of statements as a branch of control-flow. 
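 *
 * (Statement nodes thread reachability facts through Output; a block folds
 * its children's flags as it analyzes them, per the hunk below:
 *
 *   output.methodEscape = statementOutput.methodEscape;  // overwritten, so the last child decides
 *   output.loopEscape = statementOutput.loopEscape;
 *   output.anyContinue |= statementOutput.anyContinue;   // sticky once any child continues
 *   output.anyBreak |= statementOutput.anyBreak;         // likewise for break
 *   output.statementCount += statementOutput.statementCount;
 * )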
*/ -public final class SBlock extends AStatement { +public class SBlock extends AStatement { - final List<AStatement> statements; + protected final List<AStatement> statements; public SBlock(Location location, List<AStatement> statements) { super(location); @@ -44,9 +45,8 @@ public SBlock(Location location, List<AStatement> statements) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (statements == null || statements.isEmpty()) { throw createError(new IllegalArgumentException("A block must contain at least one statement.")); @@ -54,6 +54,8 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { AStatement last = statements.get(statements.size() - 1); + List<Output> statementOutputs = new ArrayList<>(statements.size()); + for (AStatement statement : statements) { // Note that we do not need to check after the last statement because // there is no statement that can be unreachable after the last. @@ -65,7 +67,8 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { statementInput.inLoop = input.inLoop; statementInput.lastSource = input.lastSource && statement == last; statementInput.lastLoop = (input.beginLoop || input.lastLoop) && statement == last; - Output statementOutput = statement.analyze(scriptRoot, scope, statementInput); + + Output statementOutput = statement.analyze(classNode, scriptRoot, scope, statementInput); output.methodEscape = statementOutput.methodEscape; output.loopEscape = statementOutput.loopEscape; @@ -73,24 +76,23 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.anyContinue |= statementOutput.anyContinue; output.anyBreak |= statementOutput.anyBreak; output.statementCount += statementOutput.statementCount; - } - return output; - } + statementOutputs.add(statementOutput); + } - @Override - BlockNode write(ClassNode classNode) { BlockNode blockNode = new BlockNode(); - for (AStatement statement : statements) { - blockNode.addStatementNode(statement.write(classNode)); + for (Output statementOutput : statementOutputs) { + blockNode.addStatementNode(statementOutput.statementNode); } blockNode.setLocation(location); blockNode.setAllEscape(output.allEscape); blockNode.setStatementCount(output.statementCount); - return blockNode; + output.statementNode = blockNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBreak.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBreak.java index 4b036153319ea..55d22a2097a9f 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBreak.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SBreak.java @@ -28,16 +28,15 @@ /** * Represents a break statement. 
*/ -public final class SBreak extends AStatement { +public class SBreak extends AStatement { public SBreak(Location location) { super(location); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.inLoop == false) { throw createError(new IllegalArgumentException("Break statement outside of a loop.")); @@ -48,16 +47,12 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.anyBreak = true; output.statementCount = 1; - return output; - } - - @Override - BreakNode write(ClassNode classNode) { BreakNode breakNode = new BreakNode(); - breakNode.setLocation(location); - return breakNode; + output.statementNode = breakNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SCatch.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SCatch.java index f00c0b18a70fc..06dc1a06514fd 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SCatch.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SCatch.java @@ -21,8 +21,10 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.CatchNode; import org.elasticsearch.painless.ir.ClassNode; +import org.elasticsearch.painless.ir.DeclarationNode; import org.elasticsearch.painless.lookup.PainlessLookupUtility; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -31,11 +33,11 @@ /** * Represents a catch block as part of a try-catch block. */ -public final class SCatch extends AStatement { +public class SCatch extends AStatement { - private final DType baseException; - private final SDeclaration declaration; - private final SBlock block; + protected final DType baseException; + protected final SDeclaration declaration; + protected final SBlock block; public SCatch(Location location, DType baseException, SDeclaration declaration, SBlock block) { super(location); @@ -46,11 +48,10 @@ public SCatch(Location location, DType baseException, SDeclaration declaration, } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); - declaration.analyze(scriptRoot, scope, new Input()); + Output declarationOutput = declaration.analyze(classNode, scriptRoot, scope, new Input()); Class baseType = baseException.resolveType(scriptRoot.getPainlessLookup()).getType(); Class type = scope.getVariable(location, declaration.name).getType(); @@ -61,12 +62,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { "to [" + PainlessLookupUtility.typeToCanonicalTypeName(baseType) + "]")); } + Output blockOutput = null; + if (block != null) { Input blockInput = new Input(); blockInput.lastSource = input.lastSource; blockInput.inLoop = input.inLoop; blockInput.lastLoop = input.lastLoop; - Output blockOutput = block.analyze(scriptRoot, scope, blockInput); + blockOutput = block.analyze(classNode, scriptRoot, scope, blockInput); output.methodEscape = blockOutput.methodEscape; output.loopEscape = blockOutput.loopEscape; @@ -76,19 +79,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { 
output.statementCount = blockOutput.statementCount; } - return output; - } - - @Override - CatchNode write(ClassNode classNode) { CatchNode catchNode = new CatchNode(); - - catchNode.setDeclarationNode(declaration.write(classNode)); - catchNode.setBlockNode(block == null ? null : block.write(classNode)); - + catchNode.setDeclarationNode((DeclarationNode)declarationOutput.statementNode); + catchNode.setBlockNode(blockOutput == null ? null : (BlockNode)blockOutput.statementNode); catchNode.setLocation(location); - return catchNode; + output.statementNode = catchNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SClass.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SClass.java index d946b9b1c69b5..1c03994d55d06 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SClass.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SClass.java @@ -22,7 +22,6 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.ScriptClassInfo; import org.elasticsearch.painless.ir.ClassNode; -import org.elasticsearch.painless.ir.StatementNode; import org.elasticsearch.painless.symbol.FunctionTable; import org.elasticsearch.painless.symbol.ScriptRoot; import org.objectweb.asm.util.Printer; @@ -36,37 +35,25 @@ /** * The root of all Painless trees. Contains a series of statements. */ -public final class SClass extends ANode { +public class SClass extends ANode { - private final ScriptClassInfo scriptClassInfo; - private final String name; - private final Printer debugStream; - private final List<SFunction> functions = new ArrayList<>(); - private final List<SField> fields = new ArrayList<>(); - - private ScriptRoot scriptRoot; - private final String sourceText; + protected final ScriptClassInfo scriptClassInfo; + protected final String name; + protected final String sourceText; + protected final Printer debugStream; + protected final List<SFunction> functions = new ArrayList<>(); public SClass(ScriptClassInfo scriptClassInfo, String name, String sourceText, Printer debugStream, Location location, List<SFunction> functions) { super(location); this.scriptClassInfo = Objects.requireNonNull(scriptClassInfo); this.name = Objects.requireNonNull(name); + this.sourceText = Objects.requireNonNull(sourceText); this.debugStream = debugStream; this.functions.addAll(Objects.requireNonNull(functions)); - this.sourceText = Objects.requireNonNull(sourceText); - } - - void addFunction(SFunction function) { - functions.add(function); } - void addField(SField field) { - fields.add(field); - } - - public ScriptRoot analyze(ScriptRoot scriptRoot) { - this.scriptRoot = scriptRoot; + public ClassNode writeClass(ScriptRoot scriptRoot) { scriptRoot.addStaticConstant("$NAME", name); scriptRoot.addStaticConstant("$SOURCE", sourceText); @@ -83,31 +70,10 @@ public ScriptRoot analyze(ScriptRoot scriptRoot) { function.name, function.returnType, function.typeParameters, function.isInternal, function.isStatic); } - // copy protection is required because synthetic functions are - // added for lambdas/method references and analysis here is - // only for user-defined functions - List<SFunction> functions = new ArrayList<>(this.functions); - for (SFunction function : functions) { - function.analyze(scriptRoot); - } - - return scriptRoot; - } - - @Override - public StatementNode write(ClassNode classNode) { - throw new UnsupportedOperationException(); - } - - public ClassNode writeClass() { ClassNode classNode = new ClassNode(); - for 
(SField field : fields) { - classNode.addFieldNode(field.write(classNode)); - } - for (SFunction function : functions) { - classNode.addFunctionNode(function.write(classNode)); + classNode.addFunctionNode(function.writeFunction(classNode, scriptRoot)); } classNode.setLocation(location); diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SContinue.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SContinue.java index 1f5c752f8ccef..830455274f9e3 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SContinue.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SContinue.java @@ -28,16 +28,15 @@ /** * Represents a continue statement. */ -public final class SContinue extends AStatement { +public class SContinue extends AStatement { public SContinue(Location location) { super(location); } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (input.inLoop == false) { throw createError(new IllegalArgumentException("Continue statement outside of a loop.")); @@ -51,16 +50,12 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.anyContinue = true; output.statementCount = 1; - return output; - } - - @Override - ContinueNode write(ClassNode classNode) { ContinueNode continueNode = new ContinueNode(); - continueNode.setLocation(location); - return continueNode; + output.statementNode = continueNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclBlock.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclBlock.java index 47bbe0d123ecb..4bbfcf6d1d505 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclBlock.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclBlock.java @@ -23,8 +23,10 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.DeclarationBlockNode; +import org.elasticsearch.painless.ir.DeclarationNode; import org.elasticsearch.painless.symbol.ScriptRoot; +import java.util.ArrayList; import java.util.Collections; import java.util.List; @@ -33,9 +35,9 @@ /** * Represents a series of declarations. 
*/ -public final class SDeclBlock extends AStatement { +public class SDeclBlock extends AStatement { - private final List<SDeclaration> declarations; + protected final List<SDeclaration> declarations; public SDeclBlock(Location location, List<SDeclaration> declarations) { super(location); @@ -44,30 +46,28 @@ public SDeclBlock(Location location, List<SDeclaration> declarations) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); + + List<Output> declarationOutputs = new ArrayList<>(declarations.size()); for (SDeclaration declaration : declarations) { - declaration.analyze(scriptRoot, scope, new Input()); + declarationOutputs.add(declaration.analyze(classNode, scriptRoot, scope, new Input())); } output.statementCount = declarations.size(); - return output; - } - - @Override - DeclarationBlockNode write(ClassNode classNode) { DeclarationBlockNode declarationBlockNode = new DeclarationBlockNode(); - for (SDeclaration declaration : declarations) { - declarationBlockNode.addDeclarationNode(declaration.write(classNode)); + for (Output declarationOutput : declarationOutputs) { + declarationBlockNode.addDeclarationNode((DeclarationNode)declarationOutput.statementNode); } declarationBlockNode.setLocation(location); - return declarationBlockNode; + output.statementNode = declarationBlockNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclaration.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclaration.java index 2a28b4d029558..22b71eb931a79 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclaration.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDeclaration.java @@ -23,7 +23,6 @@ import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.DeclarationNode; -import org.elasticsearch.painless.node.AExpression.Input; import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.Objects; @@ -31,12 +30,12 @@ /** * Represents a single variable declaration. 
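 *
 * (A declaration with an initializer shows the standard analyze-then-cast
 * idiom against the declared type; condensed from the hunk below:
 *
 *   AExpression.Input expressionInput = new AExpression.Input();
 *   expressionInput.expected = resolvedType.getType();   // steer the cast toward the declared type
 *   AExpression.Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput);
 *   expression.cast(expressionInput, expressionOutput);
 *   // when building the IR:
 *   // declarationNode.setExpressionNode(expression.cast(expressionOutput));
 * )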
*/ -public final class SDeclaration extends AStatement { +public class SDeclaration extends AStatement { - private DType type; + protected final DType type; protected final String name; protected final boolean requiresDefault; - private AExpression expression; + protected final AExpression expression; public SDeclaration(Location location, DType type, String name, boolean requiresDefault, AExpression expression) { super(location); @@ -48,37 +47,32 @@ public SDeclaration(Location location, DType type, String name, boolean requires } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); DResolvedType resolvedType = type.resolveType(scriptRoot.getPainlessLookup()); - type = resolvedType; + + AExpression.Output expressionOutput = null; if (expression != null) { AExpression.Input expressionInput = new AExpression.Input(); expressionInput.expected = resolvedType.getType(); - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); } scope.defineVariable(location, resolvedType.getType(), name, false); - return output; - } - - @Override - DeclarationNode write(ClassNode classNode) { DeclarationNode declarationNode = new DeclarationNode(); - - declarationNode.setExpressionNode(expression == null ? null : expression.cast(expression.write(classNode))); - + declarationNode.setExpressionNode(expression == null ? null : expression.cast(expressionOutput)); declarationNode.setLocation(location); - declarationNode.setDeclarationType(((DResolvedType)type).getType()); + declarationNode.setDeclarationType(resolvedType.getType()); declarationNode.setName(name); declarationNode.setRequiresDefault(requiresDefault); - return declarationNode; + output.statementNode = declarationNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDo.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDo.java index 8d31f325a049f..f5f4b0636a370 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDo.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SDo.java @@ -21,6 +21,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.DoWhileLoopNode; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -30,12 +31,10 @@ /** * Represents a do-while loop. 
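 *
 * (Loops classify a statically-true condition as "continuous", which feeds the
 * escape analysis; SDo below and SFor further down share the test:
 *
 *   boolean continuous = false;
 *   if (condition instanceof EBoolean) {
 *       continuous = ((EBoolean) condition).constant;
 *   }
 *
 * The elided hunk lines then reject a constant-false loop as extraneous and,
 * if the loop is continuous and contains no break, mark it as escaping.)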
*/ -public final class SDo extends AStatement { +public class SDo extends AStatement { - private final SBlock block; - private AExpression condition; - - private boolean continuous = false; + protected final SBlock block; + protected final AExpression condition; public SDo(Location location, SBlock block, AExpression condition) { super(location); @@ -45,10 +44,8 @@ public SDo(Location location, SBlock block, AExpression condition) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); - + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); scope = scope.newLocalScope(); if (block == null) { @@ -58,7 +55,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { Input blockInput = new Input(); blockInput.beginLoop = true; blockInput.inLoop = true; - Output blockOutput = block.analyze(scriptRoot, scope, blockInput); + Output blockOutput = block.analyze(classNode, scriptRoot, scope, blockInput); if (blockOutput.loopEscape && blockOutput.anyContinue == false) { throw createError(new IllegalArgumentException("Extraneous do while loop.")); @@ -66,8 +63,10 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { AExpression.Input conditionInput = new AExpression.Input(); conditionInput.expected = boolean.class; - condition.analyze(scriptRoot, scope, conditionInput); - condition.cast(); + AExpression.Output conditionOutput = condition.analyze(classNode, scriptRoot, scope, conditionInput); + condition.cast(conditionInput, conditionOutput); + + boolean continuous = false; if (condition instanceof EBoolean) { continuous = ((EBoolean)condition).constant; @@ -84,20 +83,15 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.statementCount = 1; - return output; - } - - @Override - DoWhileLoopNode write(ClassNode classNode) { DoWhileLoopNode doWhileLoopNode = new DoWhileLoopNode(); - - doWhileLoopNode.setConditionNode(condition.cast(condition.write(classNode))); - doWhileLoopNode.setBlockNode(block.write(classNode)); - + doWhileLoopNode.setConditionNode(condition.cast(conditionOutput)); + doWhileLoopNode.setBlockNode((BlockNode)blockOutput.statementNode); doWhileLoopNode.setLocation(location); doWhileLoopNode.setContinuous(continuous); - return doWhileLoopNode; + output.statementNode = doWhileLoopNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SEach.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SEach.java index b5c439233b64d..ac8741cc551ec 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SEach.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SEach.java @@ -36,12 +36,10 @@ */ public class SEach extends AStatement { - private final String type; - private final String name; - private AExpression expression; - private final SBlock block; - - private AStatement sub = null; + protected final String type; + protected final String name; + protected final AExpression expression; + protected final SBlock block; public SEach(Location location, String type, String name, AExpression expression, SBlock block) { super(location); @@ -53,13 +51,14 @@ public SEach(Location location, String type, String name, AExpression expression } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output 
analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); - AExpression.Output expressionOutput = expression.analyze(scriptRoot, scope, new AExpression.Input()); - expression.input.expected = expressionOutput.actual; - expression.cast(); + AExpression.Input expressionInput = new AExpression.Input(); + AExpression.Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + // TODO: no need to cast here + expressionInput.expected = expressionOutput.actual; + expression.cast(expressionInput, expressionOutput); Class clazz = scriptRoot.getPainlessLookup().canonicalTypeNameToType(this.type); @@ -70,17 +69,6 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { scope = scope.newLocalScope(); Variable variable = scope.defineVariable(location, clazz, name, true); - if (expressionOutput.actual.isArray()) { - sub = new SSubEachArray(location, variable, expression, block); - } else if (expressionOutput.actual == def.class || Iterable.class.isAssignableFrom(expressionOutput.actual)) { - sub = new SSubEachIterable(location, variable, expression, block); - } else { - throw createError(new IllegalArgumentException("Illegal for each type " + - "[" + PainlessLookupUtility.typeToCanonicalTypeName(expressionOutput.actual) + "].")); - } - - sub.analyze(scriptRoot, scope, input); - if (block == null) { throw createError(new IllegalArgumentException("Extraneous for each loop.")); } @@ -88,27 +76,36 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { Input blockInput = new Input(); blockInput.beginLoop = true; blockInput.inLoop = true; - Output blockOutput = block.analyze(scriptRoot, scope, blockInput); + Output blockOutput = block.analyze(classNode, scriptRoot, scope, blockInput); blockOutput.statementCount = Math.max(1, blockOutput.statementCount); if (blockOutput.loopEscape && blockOutput.anyContinue == false) { throw createError(new IllegalArgumentException("Extraneous for loop.")); } - output.statementCount = 1; + AStatement sub; - return output; - } + if (expressionOutput.actual.isArray()) { + sub = new SSubEachArray(location, variable, expressionOutput, blockOutput); + } else if (expressionOutput.actual == def.class || Iterable.class.isAssignableFrom(expressionOutput.actual)) { + sub = new SSubEachIterable(location, variable, expressionOutput, blockOutput); + } else { + throw createError(new IllegalArgumentException("Illegal for each type " + + "[" + PainlessLookupUtility.typeToCanonicalTypeName(expressionOutput.actual) + "].")); + } - @Override - ForEachLoopNode write(ClassNode classNode) { - ForEachLoopNode forEachLoopNode = new ForEachLoopNode(); + Output subOutput = sub.analyze(classNode, scriptRoot, scope, input); - forEachLoopNode.setConditionNode((ConditionNode)sub.write(classNode)); + output.statementCount = 1; + ForEachLoopNode forEachLoopNode = new ForEachLoopNode(); + forEachLoopNode.setConditionNode((ConditionNode)subOutput.statementNode); forEachLoopNode.setLocation(location); - return forEachLoopNode; + output.statementNode = forEachLoopNode; + + return output; + } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SExpression.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SExpression.java index f04cefa622acd..ace1be4b96736 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SExpression.java +++ 
b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SExpression.java @@ -25,7 +25,6 @@ import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.ir.ReturnNode; import org.elasticsearch.painless.ir.StatementExpressionNode; -import org.elasticsearch.painless.ir.StatementNode; import org.elasticsearch.painless.symbol.ScriptRoot; import java.util.Objects; @@ -33,9 +32,9 @@ /** * Represents the top-level node for an expression as a statement. */ -public final class SExpression extends AStatement { +public class SExpression extends AStatement { - private AExpression expression; + protected final AExpression expression; public SExpression(Location location, AExpression expression) { super(location); @@ -44,15 +43,13 @@ public SExpression(Location location, AExpression expression) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { Class rtnType = scope.getReturnType(); boolean isVoid = rtnType == void.class; AExpression.Input expressionInput = new AExpression.Input(); expressionInput.read = input.lastSource && !isVoid; - AExpression.Output expressionOutput = expression.analyze(scriptRoot, scope, expressionInput); + AExpression.Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); if ((input.lastSource == false || isVoid) && expressionOutput.statement == false) { throw createError(new IllegalArgumentException("Not a statement.")); @@ -60,40 +57,33 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { boolean rtn = input.lastSource && isVoid == false && expressionOutput.actual != void.class; - expression.input.expected = rtn ? rtnType : expressionOutput.actual; - expression.input.internal = rtn; - expression.cast(); + expressionInput.expected = rtn ? rtnType : expressionOutput.actual; + expressionInput.internal = rtn; + expression.cast(expressionInput, expressionOutput); - output = new Output(); + Output output = new Output(); output.methodEscape = rtn; output.loopEscape = rtn; output.allEscape = rtn; output.statementCount = 1; - return output; - } - - @Override - StatementNode write(ClassNode classNode) { - ExpressionNode expressionNode = expression.cast(expression.write(classNode)); + ExpressionNode expressionNode = expression.cast(expressionOutput); if (output.methodEscape) { ReturnNode returnNode = new ReturnNode(); - returnNode.setExpressionNode(expressionNode); - returnNode.setLocation(location); - return returnNode; + output.statementNode = returnNode; } else { StatementExpressionNode statementExpressionNode = new StatementExpressionNode(); - statementExpressionNode.setExpressionNode(expressionNode); - statementExpressionNode.setLocation(location); - return statementExpressionNode; + output.statementNode = statementExpressionNode; } + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SField.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SField.java deleted file mode 100644 index 944802fb7ac39..0000000000000 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SField.java +++ /dev/null @@ -1,70 +0,0 @@ -/* - * Licensed to Elasticsearch under one or more contributor - * license agreements. See the NOTICE file distributed with - * this work for additional information regarding copyright - * ownership. 
Elasticsearch licenses this file to you under - * the Apache License, Version 2.0 (the "License"); you may - * not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, - * software distributed under the License is distributed on an - * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY - * KIND, either express or implied. See the License for the - * specific language governing permissions and limitations - * under the License. - */ - -package org.elasticsearch.painless.node; - -import org.elasticsearch.painless.Location; -import org.elasticsearch.painless.ir.ClassNode; -import org.elasticsearch.painless.ir.FieldNode; - -/** - * Represents a member field for its parent class (internal only). - */ -public class SField extends ANode { - - private final int modifiers; - private final String name; - private final Class type; - - /** - * Standard constructor. - * @param location original location in the source - * @param modifiers java modifiers for the field - * @param name name of the field - * @param type type of the field - */ - public SField(Location location, int modifiers, String name, Class type) { - super(location); - - this.modifiers = modifiers; - this.name = name; - this.type = type; - } - - public String getName() { - return name; - } - - @Override - FieldNode write(ClassNode classNode) { - FieldNode fieldNode = new FieldNode(); - - fieldNode.setLocation(location); - fieldNode.setModifiers(modifiers); - fieldNode.setName(name); - fieldNode.setFieldType(type); - - return fieldNode; - } - - @Override - public String toString() { - return singleLineToString(name, type); - } -} diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFor.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFor.java index 241c50b8fe2b9..ce43e9a434f1a 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFor.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFor.java @@ -21,8 +21,8 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; -import org.elasticsearch.painless.ir.ExpressionNode; import org.elasticsearch.painless.ir.ForLoopNode; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -33,14 +33,12 @@ /** * Represents a for loop. 
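The hunks above for SEach and SExpression, and every statement node that follows, repeat one mechanical change: the separate write(ClassNode) pass is folded into analyze(...), which now receives the ClassNode and returns an Output carrying the finished IR node, so the mutable this.input/this.output/sub fields disappear. A minimal, self-contained sketch of the new contract (placeholder types; only Output.statementNode and the flag fields mirror this diff):

    // Illustrative sketch, not the real Painless classes.
    final class ClassNode {}
    final class StatementNode {}
    abstract class StatementSketch {
        static final class Input { boolean lastSource, beginLoop, inLoop, lastLoop; }
        static final class Output {
            int statementCount;
            boolean methodEscape, loopEscape, allEscape, anyContinue, anyBreak;
            StatementNode statementNode; // IR result, formerly returned by write(ClassNode)
        }
        // Single pass: semantic checks first, IR construction at the end, no node-level state.
        abstract Output analyze(ClassNode classNode, Input input);
    }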
*/ -public final class SFor extends AStatement { +public class SFor extends AStatement { - private ANode initializer; - private AExpression condition; - private AExpression afterthought; - private final SBlock block; - - private boolean continuous = false; + protected final ANode initializer; + protected final AExpression condition; + protected final AExpression afterthought; + protected final SBlock block; public SFor(Location location, ANode initializer, AExpression condition, AExpression afterthought, SBlock block) { super(location); @@ -52,37 +50,42 @@ public SFor(Location location, ANode initializer, AExpression condition, AExpres } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { scope = scope.newLocalScope(); + Output initializerStatementOutput = null; + AExpression.Output initializerExpressionOutput = null; + if (initializer != null) { if (initializer instanceof SDeclBlock) { - ((SDeclBlock)initializer).analyze(scriptRoot, scope, new Input()); + initializerStatementOutput = ((SDeclBlock)initializer).analyze(classNode, scriptRoot, scope, new Input()); } else if (initializer instanceof AExpression) { AExpression initializer = (AExpression)this.initializer; AExpression.Input initializerInput = new AExpression.Input(); initializerInput.read = false; - AExpression.Output initializerOutput = initializer.analyze(scriptRoot, scope, initializerInput); + initializerExpressionOutput = initializer.analyze(classNode, scriptRoot, scope, initializerInput); - if (initializerOutput.statement == false) { + if (initializerExpressionOutput.statement == false) { throw createError(new IllegalArgumentException("Not a statement.")); } - initializer.input.expected = initializerOutput.actual; - initializer.cast(); + initializerInput.expected = initializerExpressionOutput.actual; + initializer.cast(initializerInput, initializerExpressionOutput); } else { throw createError(new IllegalStateException("Illegal tree structure.")); } } + boolean continuous = false; + + AExpression.Output conditionOutput = null; + if (condition != null) { AExpression.Input conditionInput = new AExpression.Input(); conditionInput.expected = boolean.class; - condition.analyze(scriptRoot, scope, conditionInput); - condition.cast(); + conditionOutput = condition.analyze(classNode, scriptRoot, scope, conditionInput); + condition.cast(conditionInput, conditionOutput); if (condition instanceof EBoolean) { continuous = ((EBoolean)condition).constant; @@ -99,27 +102,30 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { continuous = true; } + AExpression.Output afterthoughtOutput = null; + if (afterthought != null) { AExpression.Input afterthoughtInput = new AExpression.Input(); afterthoughtInput.read = false; - AExpression.Output afterthoughtOutput = afterthought.analyze(scriptRoot, scope, afterthoughtInput); + afterthoughtOutput = afterthought.analyze(classNode, scriptRoot, scope, afterthoughtInput); if (afterthoughtOutput.statement == false) { throw createError(new IllegalArgumentException("Not a statement.")); } - afterthought.input.expected = afterthoughtOutput.actual; - afterthought.cast(); + afterthoughtInput.expected = afterthoughtOutput.actual; + afterthought.cast(afterthoughtInput, afterthoughtOutput); } - output = new Output(); + Output output = new Output(); + Output blockOutput = null; if (block != null) { Input blockInput = new Input(); blockInput.beginLoop = true; 
blockInput.inLoop = true; - Output blockOutput = block.analyze(scriptRoot, scope, blockInput); + blockOutput = block.analyze(classNode, scriptRoot, scope, blockInput); if (blockOutput.loopEscape && blockOutput.anyContinue == false) { throw createError(new IllegalArgumentException("Extraneous for loop.")); @@ -135,24 +141,19 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.statementCount = 1; - return output; - } - - @Override - ForLoopNode write(ClassNode classNode) { ForLoopNode forLoopNode = new ForLoopNode(); - forLoopNode.setInitialzerNode(initializer == null ? null : initializer instanceof AExpression ? - ((AExpression)initializer).cast((ExpressionNode)initializer.write(classNode)) : - initializer.write(classNode)); - forLoopNode.setConditionNode(condition == null ? null : condition.cast(condition.write(classNode))); - forLoopNode.setAfterthoughtNode(afterthought == null ? null : afterthought.cast(afterthought.write(classNode))); - forLoopNode.setBlockNode(block == null ? null : block.write(classNode)); - + ((AExpression)initializer).cast(initializerExpressionOutput) : + initializerStatementOutput.statementNode); + forLoopNode.setConditionNode(condition == null ? null : condition.cast(conditionOutput)); + forLoopNode.setAfterthoughtNode(afterthought == null ? null : afterthought.cast(afterthoughtOutput)); + forLoopNode.setBlockNode(blockOutput == null ? null : (BlockNode)blockOutput.statementNode); forLoopNode.setLocation(location); forLoopNode.setContinuous(continuous); - return forLoopNode; + output.statementNode = forLoopNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFunction.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFunction.java index 96bf040099a17..fad71982b4772 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFunction.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SFunction.java @@ -46,32 +46,28 @@ /** * Represents a user-defined function. */ -public final class SFunction extends ANode { +public class SFunction extends ANode { - private final String rtnTypeStr; - public final String name; - private final List paramTypeStrs; - private final List paramNameStrs; - private final SBlock block; - public final boolean isInternal; - public final boolean isStatic; - public final boolean synthetic; + protected final String rtnTypeStr; + protected final String name; + protected final List paramTypeStrs; + protected final List paramNameStrs; + protected final SBlock block; + protected final boolean isInternal; + protected final boolean isStatic; + protected final boolean synthetic; /** * If set to {@code true} default return values are inserted if * not all paths return a value. 
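In the SFunction hunks below, analyze(ScriptRoot) and write(ClassNode) merge the same way into writeFunction(ClassNode, ScriptRoot), letting maxLoopCounter and methodEscape become locals instead of fields. When the block can complete normally (methodEscape == false) and auto-return is enabled, a trailing return is synthesized; the type-default rule that return plausibly carries, sketched as a hypothetical helper (the real code builds constant IR nodes, elided by this hunk):

    // Hypothetical helper: the value a synthesized trailing return would carry.
    static Object defaultReturnValue(Class<?> returnType) {
        if (returnType == boolean.class) return false;
        if (returnType.isPrimitive() && returnType != void.class) return 0; // numeric zero
        return null; // reference types; a void return carries no value at all
    }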
*/ - public final boolean isAutoReturnEnabled; + protected final boolean isAutoReturnEnabled; - private int maxLoopCounter; + protected Class<?> returnType; + protected List<Class<?>> typeParameters; + protected MethodType methodType; - Class<?> returnType; - List<Class<?>> typeParameters; - MethodType methodType; - - org.objectweb.asm.commons.Method method; - - private boolean methodEscape; + protected org.objectweb.asm.commons.Method method; public SFunction(Location location, String rtnType, String name, List<String> paramTypes, List<String> paramNames, @@ -89,6 +85,7 @@ public SFunction(Location location, String rtnType, String name, this.isAutoReturnEnabled = isAutoReturnEnabled; } + // TODO: do this in class on add to remove need for mutable state void generateSignature(PainlessLookup painlessLookup) { returnType = painlessLookup.canonicalTypeNameToType(rtnTypeStr); @@ -121,7 +118,7 @@ void generateSignature(PainlessLookup painlessLookup) { PainlessLookupUtility.typeToJavaType(returnType), paramClasses).toMethodDescriptorString()); } - void analyze(ScriptRoot scriptRoot) { + FunctionNode writeFunction(ClassNode classNode, ScriptRoot scriptRoot) { FunctionScope functionScope = newFunctionScope(returnType); for (int index = 0; index < typeParameters.size(); ++index) { @@ -130,7 +127,7 @@ void analyze(ScriptRoot scriptRoot) { functionScope.defineVariable(location, typeParameter, parameterName, false); } - maxLoopCounter = scriptRoot.getCompilerSettings().getMaxLoopCounter(); + int maxLoopCounter = scriptRoot.getCompilerSettings().getMaxLoopCounter(); if (block.statements.isEmpty()) { throw createError(new IllegalArgumentException("Cannot generate an empty function [" + name + "].")); @@ -138,8 +135,8 @@ Input blockInput = new Input(); blockInput.lastSource = true; - Output blockOutput = block.analyze(scriptRoot, functionScope.newLocalScope(), blockInput); - methodEscape = blockOutput.methodEscape; + Output blockOutput = block.analyze(classNode, scriptRoot, functionScope.newLocalScope(), blockInput); + boolean methodEscape = blockOutput.methodEscape; if (methodEscape == false && isAutoReturnEnabled == false && returnType != void.class) { throw createError(new IllegalArgumentException("not all paths provide a return value " + @@ -152,11 +149,8 @@ scriptRoot.setUsedVariables(functionScope.getUsedVariables()); } // TODO: end - } - @Override - public FunctionNode write(ClassNode classNode) { - BlockNode blockNode = block.write(classNode); + BlockNode blockNode = (BlockNode)blockOutput.statementNode; if (methodEscape == false) { ExpressionNode expressionNode; diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIf.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIf.java index a7d738670ec6f..44b6e0d6105c4 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIf.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIf.java @@ -21,6 +21,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.IfNode; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -30,10 +31,10 @@ /** * Represents an if block. 
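Note the two cast(...) arities used throughout these hunks: the two-argument expression.cast(expressionInput, expressionOutput) runs at analysis time and records the legal conversion between output.actual and input.expected, while the one-argument expression.cast(expressionOutput) runs while wiring the IR and wraps the built expression node when a conversion was recorded. Roughly (castNodeAround is an invented name for the wrapping step):

    // 1) analysis time: decide and remember the conversion (null when none is needed)
    PainlessCast cast = AnalyzerCaster.getLegalCast(location, output.actual, input.expected, explicit, internal);
    // 2) IR time: apply it by decorating the already-built node
    ExpressionNode node = cast == null ? output.expressionNode : castNodeAround(output.expressionNode, cast);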
*/ -public final class SIf extends AStatement { +public class SIf extends AStatement { - AExpression condition; - final SBlock ifblock; + protected final AExpression condition; + protected final SBlock ifblock; public SIf(Location location, AExpression condition, SBlock ifblock) { super(location); @@ -43,14 +44,13 @@ public SIf(Location location, AExpression condition, SBlock ifblock) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); AExpression.Input conditionInput = new AExpression.Input(); conditionInput.expected = boolean.class; - condition.analyze(scriptRoot, scope, conditionInput); - condition.cast(); + AExpression.Output conditionOutput = condition.analyze(classNode, scriptRoot, scope, conditionInput); + condition.cast(conditionInput, conditionOutput); if (condition instanceof EBoolean) { throw createError(new IllegalArgumentException("Extraneous if statement.")); @@ -65,25 +65,20 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { ifblockInput.inLoop = input.inLoop; ifblockInput.lastLoop = input.lastLoop; - Output ifblockOutput = ifblock.analyze(scriptRoot, scope.newLocalScope(), ifblockInput); + Output ifblockOutput = ifblock.analyze(classNode, scriptRoot, scope.newLocalScope(), ifblockInput); output.anyContinue = ifblockOutput.anyContinue; output.anyBreak = ifblockOutput.anyBreak; output.statementCount = ifblockOutput.statementCount; - return output; - } - - @Override - IfNode write(ClassNode classNode) { IfNode ifNode = new IfNode(); - - ifNode.setConditionNode(condition.cast(condition.write(classNode))); - ifNode.setBlockNode(ifblock.write(classNode)); - + ifNode.setConditionNode(condition.cast(conditionOutput)); + ifNode.setBlockNode((BlockNode)ifblockOutput.statementNode); ifNode.setLocation(location); - return ifNode; + output.statementNode = ifNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIfElse.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIfElse.java index f433db860ccfb..6ae366ad8428a 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIfElse.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SIfElse.java @@ -21,6 +21,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.IfElseNode; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -33,11 +34,11 @@ /** * Represents an if/else block. 
*/ -public final class SIfElse extends AStatement { +public class SIfElse extends AStatement { - private AExpression condition; - private final SBlock ifblock; - private final SBlock elseblock; + protected final AExpression condition; + protected final SBlock ifblock; + protected final SBlock elseblock; public SIfElse(Location location, AExpression condition, SBlock ifblock, SBlock elseblock) { super(location); @@ -48,14 +49,13 @@ public SIfElse(Location location, AExpression condition, SBlock ifblock, SBlock } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); AExpression.Input conditionInput = new AExpression.Input(); conditionInput.expected = boolean.class; - condition.analyze(scriptRoot, scope, conditionInput); - condition.cast(); + AExpression.Output conditionOutput = condition.analyze(classNode, scriptRoot, scope, conditionInput); + condition.cast(conditionInput, conditionOutput); if (condition instanceof EBoolean) { throw createError(new IllegalArgumentException("Extraneous if statement.")); @@ -70,7 +70,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { ifblockInput.inLoop = input.inLoop; ifblockInput.lastLoop = input.lastLoop; - Output ifblockOutput = ifblock.analyze(scriptRoot, scope.newLocalScope(), ifblockInput); + Output ifblockOutput = ifblock.analyze(classNode, scriptRoot, scope.newLocalScope(), ifblockInput); output.anyContinue = ifblockOutput.anyContinue; output.anyBreak = ifblockOutput.anyBreak; @@ -85,7 +85,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { elseblockInput.inLoop = input.inLoop; elseblockInput.lastLoop = input.lastLoop; - Output elseblockOutput = elseblock.analyze(scriptRoot, scope.newLocalScope(), elseblockInput); + Output elseblockOutput = elseblock.analyze(classNode, scriptRoot, scope.newLocalScope(), elseblockInput); output.methodEscape = ifblockOutput.methodEscape && elseblockOutput.methodEscape; output.loopEscape = ifblockOutput.loopEscape && elseblockOutput.loopEscape; @@ -94,20 +94,15 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.anyBreak |= elseblockOutput.anyBreak; output.statementCount = Math.max(ifblockOutput.statementCount, elseblockOutput.statementCount); - return output; - } - - @Override - IfElseNode write(ClassNode classNode) { IfElseNode ifElseNode = new IfElseNode(); - - ifElseNode.setConditionNode(condition.cast(condition.write(classNode))); - ifElseNode.setBlockNode(ifblock.write(classNode)); - ifElseNode.setElseBlockNode(elseblock.write(classNode)); - + ifElseNode.setConditionNode(condition.cast(conditionOutput)); + ifElseNode.setBlockNode((BlockNode)ifblockOutput.statementNode); + ifElseNode.setElseBlockNode((BlockNode)elseblockOutput.statementNode); ifElseNode.setLocation(location); - return ifElseNode; + output.statementNode = ifElseNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SReturn.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SReturn.java index 23a5d4183edbc..6ab6c6590e350 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SReturn.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SReturn.java @@ -29,9 +29,9 @@ /** * Represents a return statement. 
*/ -public final class SReturn extends AStatement { +public class SReturn extends AStatement { - private AExpression expression; + protected final AExpression expression; public SReturn(Location location, AExpression expression) { super(location); @@ -40,9 +40,10 @@ public SReturn(Location location, AExpression expression) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); + + AExpression.Output expressionOutput = null; if (expression == null) { if (scope.getReturnType() != void.class) { @@ -54,8 +55,8 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { AExpression.Input expressionInput = new AExpression.Input(); expressionInput.expected = scope.getReturnType(); expressionInput.internal = true; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); } output.methodEscape = true; @@ -64,18 +65,13 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.statementCount = 1; - return output; - } - - @Override - ReturnNode write(ClassNode classNode) { ReturnNode returnNode = new ReturnNode(); - - returnNode.setExpressionNode(expression == null ? null : expression.cast(expression.write(classNode))); - + returnNode.setExpressionNode(expression == null ? null : expression.cast(expressionOutput)); returnNode.setLocation(location); - return returnNode; + output.statementNode = returnNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachArray.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachArray.java index 5e1846457d515..fef81dfed6968 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachArray.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachArray.java @@ -23,6 +23,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.Scope.Variable; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.ForEachSubArrayNode; import org.elasticsearch.painless.lookup.PainlessCast; @@ -33,46 +34,34 @@ /** * Represents a for-each loop for arrays. 
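SSubEachArray below (and SSubEachIterable after it) now takes the parent SEach's pre-computed AExpression.Output and block Output instead of re-analyzing child AST nodes, and its toString() is left stubbed to return null in this revision, with the old rendering commented out (the matching NodeToStringTests cases are deleted further down). In plain Java, the loop that ForEachSubArrayNode stands for looks roughly like this (illustrative; the "#array"/"#index" names match the temporaries defined in the hunk):

    // Array for-each, desugared the way the IR node emits it:
    static int sumForEach(int[] expressionValue) {
        int total = 0;
        int[] array = expressionValue;                         // "#array" + offset temp slot
        for (int index = 0; index < array.length; ++index) {   // "#index" + offset temp slot
            int variable = array[index];                       // AnalyzerCaster cast applies here
            total += variable;                                 // loop block
        }
        return total;
    }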
*/ -final class SSubEachArray extends AStatement { - private final Variable variable; - private AExpression expression; - private final SBlock block; +public class SSubEachArray extends AStatement { - private PainlessCast cast = null; - private Variable array = null; - private Variable index = null; - private Class indexed = null; + protected final Variable variable; + protected final AExpression.Output expressionOutput; + protected final Output blockOutput; - SSubEachArray(Location location, Variable variable, AExpression expression, SBlock block) { + SSubEachArray(Location location, Variable variable, AExpression.Output expressionOutput, Output blockOutput) { super(location); this.variable = Objects.requireNonNull(variable); - this.expression = Objects.requireNonNull(expression); - this.block = block; + this.expressionOutput = Objects.requireNonNull(expressionOutput); + this.blockOutput = blockOutput; } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); // We must store the array and index as variables for securing slots on the stack, and // also add the location offset to make the names unique in case of nested for each loops. - array = scope.defineInternalVariable(location, expression.output.actual, "array" + location.getOffset(), true); - index = scope.defineInternalVariable(location, int.class, "index" + location.getOffset(), true); - indexed = expression.output.actual.getComponentType(); - cast = AnalyzerCaster.getLegalCast(location, indexed, variable.getType(), true, true); + Variable array = scope.defineVariable(location, expressionOutput.actual, "#array" + location.getOffset(), true); + Variable index = scope.defineVariable(location, int.class, "#index" + location.getOffset(), true); + Class indexed = expressionOutput.actual.getComponentType(); + PainlessCast cast = AnalyzerCaster.getLegalCast(location, indexed, variable.getType(), true, true); - return output; - } - - @Override - ForEachSubArrayNode write(ClassNode classNode) { ForEachSubArrayNode forEachSubArrayNode = new ForEachSubArrayNode(); - - forEachSubArrayNode.setConditionNode(expression.write(classNode)); - forEachSubArrayNode.setBlockNode(block.write(classNode)); - + forEachSubArrayNode.setConditionNode(expressionOutput.expressionNode); + forEachSubArrayNode.setBlockNode((BlockNode)blockOutput.statementNode); forEachSubArrayNode.setLocation(location); forEachSubArrayNode.setVariableType(variable.getType()); forEachSubArrayNode.setVariableName(variable.getName()); @@ -84,11 +73,14 @@ ForEachSubArrayNode write(ClassNode classNode) { forEachSubArrayNode.setIndexedType(indexed); forEachSubArrayNode.setContinuous(false); - return forEachSubArrayNode; + output.statementNode = forEachSubArrayNode; + + return output; } @Override public String toString() { - return singleLineToString(variable.getCanonicalTypeName(), variable.getName(), expression, block); + //return singleLineToString(variable.getCanonicalTypeName(), variable.getName(), expression, block); + return null; } } diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachIterable.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachIterable.java index 4198452ff8e79..83561431db1b1 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachIterable.java +++ 
b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SSubEachIterable.java @@ -23,6 +23,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; import org.elasticsearch.painless.Scope.Variable; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.ForEachSubIterableNode; import org.elasticsearch.painless.lookup.PainlessCast; @@ -38,56 +39,46 @@ /** * Represents a for-each loop for iterables. */ -final class SSubEachIterable extends AStatement { +public class SSubEachIterable extends AStatement { - private AExpression expression; - private final SBlock block; - private final Variable variable; + protected final Variable variable; + protected final AExpression.Output expressionOutput; + protected final Output blockOutput; - private PainlessCast cast = null; - private Variable iterator = null; - private PainlessMethod method = null; - - SSubEachIterable(Location location, Variable variable, AExpression expression, SBlock block) { + SSubEachIterable(Location location, Variable variable, AExpression.Output expressionOutput, Output blockOutput) { super(location); this.variable = Objects.requireNonNull(variable); - this.expression = Objects.requireNonNull(expression); - this.block = block; + this.expressionOutput = Objects.requireNonNull(expressionOutput); + this.blockOutput = blockOutput; } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); // We must store the iterator as a variable for securing a slot on the stack, and // also add the location offset to make the name unique in case of nested for each loops. 
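// The lines below also pin down how iteration is resolved at analysis time:
// for a def receiver, method stays null and iteration binds dynamically at
// runtime; for any other type, lookupPainlessMethod(actual, false, "iterator", 0)
// must find an iterator() method or analysis fails. The static case is
// equivalent to this Java (illustrative; the temp lives in the "#itr" slot):
//
//     Iterator<?> itr = receiver.iterator();
//     while (itr.hasNext()) {
//         Object next = itr.next();      // then cast def -> loop variable type
//         // ... loop block ...
//     }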
- iterator = scope.defineInternalVariable(location, Iterator.class, "itr" + location.getOffset(), true); + Variable iterator = scope.defineVariable(location, Iterator.class, "#itr" + location.getOffset(), true); + + PainlessMethod method; - if (expression.output.actual == def.class) { + if (expressionOutput.actual == def.class) { method = null; } else { - method = scriptRoot.getPainlessLookup().lookupPainlessMethod(expression.output.actual, false, "iterator", 0); + method = scriptRoot.getPainlessLookup().lookupPainlessMethod(expressionOutput.actual, false, "iterator", 0); if (method == null) { throw createError(new IllegalArgumentException( - "method [" + typeToCanonicalTypeName(expression.output.actual) + ", iterator/0] not found")); + "method [" + typeToCanonicalTypeName(expressionOutput.actual) + ", iterator/0] not found")); } } - cast = AnalyzerCaster.getLegalCast(location, def.class, variable.getType(), true, true); - - return output; - } + PainlessCast cast = AnalyzerCaster.getLegalCast(location, def.class, variable.getType(), true, true); - @Override - ForEachSubIterableNode write(ClassNode classNode) { ForEachSubIterableNode forEachSubIterableNode = new ForEachSubIterableNode(); - - forEachSubIterableNode.setConditionNode(expression.write(classNode)); - forEachSubIterableNode.setBlockNode(block.write(classNode)); - + forEachSubIterableNode.setConditionNode(expressionOutput.expressionNode); + forEachSubIterableNode.setBlockNode((BlockNode)blockOutput.statementNode); forEachSubIterableNode.setLocation(location); forEachSubIterableNode.setVariableType(variable.getType()); forEachSubIterableNode.setVariableName(variable.getName()); @@ -97,11 +88,14 @@ ForEachSubIterableNode write(ClassNode classNode) { forEachSubIterableNode.setMethod(method); forEachSubIterableNode.setContinuous(false); - return forEachSubIterableNode; + output.statementNode = forEachSubIterableNode; + + return output; } @Override public String toString() { - return singleLineToString(variable.getCanonicalTypeName(), variable.getName(), expression, block); + //return singleLineToString(variable.getCanonicalTypeName(), variable.getName(), expression, block); + return null; } } diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SThrow.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SThrow.java index 86b40fafa378b..3fff9615b12d9 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SThrow.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SThrow.java @@ -30,9 +30,9 @@ /** * Represents a throw statement. 
*/ -public final class SThrow extends AStatement { +public class SThrow extends AStatement { - private AExpression expression; + protected final AExpression expression; public SThrow(Location location, AExpression expression) { super(location); @@ -41,32 +41,26 @@ public SThrow(Location location, AExpression expression) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); AExpression.Input expressionInput = new AExpression.Input(); expressionInput.expected = Exception.class; - expression.analyze(scriptRoot, scope, expressionInput); - expression.cast(); + AExpression.Output expressionOutput = expression.analyze(classNode, scriptRoot, scope, expressionInput); + expression.cast(expressionInput, expressionOutput); output.methodEscape = true; output.loopEscape = true; output.allEscape = true; output.statementCount = 1; - return output; - } - - @Override - ThrowNode write(ClassNode classNode) { ThrowNode throwNode = new ThrowNode(); - - throwNode.setExpressionNode(expression.cast(expression.write(classNode))); - + throwNode.setExpressionNode(expression.cast(expressionOutput)); throwNode.setLocation(location); - return throwNode; + output.statementNode = throwNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/STry.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/STry.java index 7798fd13b13a2..b9fd59f6b3fc1 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/STry.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/STry.java @@ -21,10 +21,13 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.BlockNode; +import org.elasticsearch.painless.ir.CatchNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.TryNode; import org.elasticsearch.painless.symbol.ScriptRoot; +import java.util.ArrayList; import java.util.Collections; import java.util.List; @@ -33,10 +36,10 @@ /** * Represents the try block as part of a try-catch block. 
*/ -public final class STry extends AStatement { +public class STry extends AStatement { - private final SBlock block; - private final List catches; + protected final SBlock block; + protected final List catches; public STry(Location location, SBlock block, List catches) { super(location); @@ -46,9 +49,8 @@ public STry(Location location, SBlock block, List catches) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); if (block == null) { throw createError(new IllegalArgumentException("Extraneous try statement.")); @@ -59,7 +61,7 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { blockInput.inLoop = input.inLoop; blockInput.lastLoop = input.lastLoop; - Output blockOutput = block.analyze(scriptRoot, scope.newLocalScope(), blockInput); + Output blockOutput = block.analyze(classNode, scriptRoot, scope.newLocalScope(), blockInput); output.methodEscape = blockOutput.methodEscape; output.loopEscape = blockOutput.loopEscape; @@ -69,13 +71,15 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { int statementCount = 0; + List catchOutputs = new ArrayList<>(); + for (SCatch catc : catches) { Input catchInput = new Input(); catchInput.lastSource = input.lastSource; catchInput.inLoop = input.inLoop; catchInput.lastLoop = input.lastLoop; - Output catchOutput = catc.analyze(scriptRoot, scope.newLocalScope(), catchInput); + Output catchOutput = catc.analyze(classNode, scriptRoot, scope.newLocalScope(), catchInput); output.methodEscape &= catchOutput.methodEscape; output.loopEscape &= catchOutput.loopEscape; @@ -83,27 +87,25 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.anyContinue |= catchOutput.anyContinue; output.anyBreak |= catchOutput.anyBreak; + catchOutputs.add(catchOutput); + statementCount = Math.max(statementCount, catchOutput.statementCount); } output.statementCount = blockOutput.statementCount + statementCount; - return output; - } - - @Override - TryNode write(ClassNode classNode) { TryNode tryNode = new TryNode(); - for (SCatch catc : catches) { - tryNode.addCatchNode(catc.write(classNode)); + for (Output catchOutput : catchOutputs) { + tryNode.addCatchNode((CatchNode)catchOutput.statementNode); } - tryNode.setBlockNode(block.write(classNode)); - + tryNode.setBlockNode((BlockNode)blockOutput.statementNode); tryNode.setLocation(location); - return tryNode; + output.statementNode = tryNode; + + return output; } @Override diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SWhile.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SWhile.java index eead51786e2d3..9fa3139b87eed 100644 --- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SWhile.java +++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/node/SWhile.java @@ -21,6 +21,7 @@ import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Scope; +import org.elasticsearch.painless.ir.BlockNode; import org.elasticsearch.painless.ir.ClassNode; import org.elasticsearch.painless.ir.WhileNode; import org.elasticsearch.painless.symbol.ScriptRoot; @@ -30,12 +31,10 @@ /** * Represents a while loop. 
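SWhile below folds a constant condition the same way SFor does above, with continuous now a local computed from EBoolean instead of node state. The rule, reconstructed as a hedged sketch from the context these hunks elide (message text abbreviated; a constant false condition is rejected, and a constant true loop with no break escapes the method):

    // Hedged sketch of the constant-condition rule:
    static boolean analyzeConstantCondition(boolean constantValue, boolean blockHasBreak) {
        if (constantValue == false) {
            throw new IllegalArgumentException("Extraneous while loop."); // body never runs
        }
        // while(true): only a break can leave the loop, so without one the
        // statement escapes (methodEscape/allEscape in the Output flags)
        return blockHasBreak == false;
    }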
*/ -public final class SWhile extends AStatement { +public class SWhile extends AStatement { - private AExpression condition; - private final SBlock block; - - private boolean continuous = false; + protected final AExpression condition; + protected final SBlock block; public SWhile(Location location, AExpression condition, SBlock block) { super(location); @@ -45,16 +44,16 @@ public SWhile(Location location, AExpression condition, SBlock block) { } @Override - Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { - this.input = input; - output = new Output(); - + Output analyze(ClassNode classNode, ScriptRoot scriptRoot, Scope scope, Input input) { + Output output = new Output(); scope = scope.newLocalScope(); AExpression.Input conditionInput = new AExpression.Input(); conditionInput.expected = boolean.class; - condition.analyze(scriptRoot, scope, conditionInput); - condition.cast(); + AExpression.Output conditionOutput = condition.analyze(classNode, scriptRoot, scope, conditionInput); + condition.cast(conditionInput, conditionOutput); + + boolean continuous = false; if (condition instanceof EBoolean) { continuous = ((EBoolean)condition).constant; @@ -68,12 +67,14 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { } } + Output blockOutput = null; + if (block != null) { Input blockInput = new Input(); blockInput.beginLoop = true; blockInput.inLoop = true; - Output blockOutput = block.analyze(scriptRoot, scope, blockInput); + blockOutput = block.analyze(classNode, scriptRoot, scope, blockInput); if (blockOutput.loopEscape && blockOutput.anyContinue == false) { throw createError(new IllegalArgumentException("Extraneous while loop.")); @@ -89,20 +90,15 @@ Output analyze(ScriptRoot scriptRoot, Scope scope, Input input) { output.statementCount = 1; - return output; - } - - @Override - WhileNode write(ClassNode classNode) { WhileNode whileNode = new WhileNode(); - - whileNode.setConditionNode(condition.cast(condition.write(classNode))); - whileNode.setBlockNode(block == null ? null : block.write(classNode)); - + whileNode.setConditionNode(condition.cast(conditionOutput)); + whileNode.setBlockNode(blockOutput == null ? 
null : (BlockNode)blockOutput.statementNode); whileNode.setLocation(location); whileNode.setContinuous(continuous); - return whileNode; + output.statementNode = whileNode; + + return output; } @Override diff --git a/modules/lang-painless/src/test/java/org/elasticsearch/painless/DefOptimizationTests.java b/modules/lang-painless/src/test/java/org/elasticsearch/painless/DefOptimizationTests.java index 0a7c0cdcd0d7e..0cee95adef5fd 100644 --- a/modules/lang-painless/src/test/java/org/elasticsearch/painless/DefOptimizationTests.java +++ b/modules/lang-painless/src/test/java/org/elasticsearch/painless/DefOptimizationTests.java @@ -20,6 +20,7 @@ package org.elasticsearch.painless; public class DefOptimizationTests extends ScriptTestCase { + public void testIntBraceArrayOptiLoad() { final String script = "int x = 0; def y = new int[1]; y[0] = 5; x = y[0]; return x;"; assertBytecodeExists(script, "INVOKEDYNAMIC arrayLoad(Ljava/lang/Object;I)I"); diff --git a/modules/lang-painless/src/test/java/org/elasticsearch/painless/node/NodeToStringTests.java b/modules/lang-painless/src/test/java/org/elasticsearch/painless/node/NodeToStringTests.java index bdc683125eb68..70e754a4168f5 100644 --- a/modules/lang-painless/src/test/java/org/elasticsearch/painless/node/NodeToStringTests.java +++ b/modules/lang-painless/src/test/java/org/elasticsearch/painless/node/NodeToStringTests.java @@ -24,7 +24,6 @@ import org.elasticsearch.painless.FeatureTestObject; import org.elasticsearch.painless.Location; import org.elasticsearch.painless.Operation; -import org.elasticsearch.painless.Scope.Variable; import org.elasticsearch.painless.ScriptClassInfo; import org.elasticsearch.painless.action.PainlessExecuteAction; import org.elasticsearch.painless.antlr.Walker; @@ -749,30 +748,6 @@ public void testSIfElse() { + "}"); } - public void testSSubEachArray() { - Location l = new Location(getTestName(), 0); - Variable v = new Variable(int.class, "test", false); - AExpression e = new ENewArray(l, "int", Arrays.asList(new EConstant(l, 1), new EConstant(l, 2), new EConstant(l, 3)), true); - SBlock b = new SBlock(l, singletonList(new SReturn(l, new EConstant(l, 5)))); - SSubEachArray node = new SSubEachArray(l, v, e, b); - assertEquals( - "(SSubEachArray int test (ENewArray int init (Args (EConstant Integer 1) (EConstant Integer 2) (EConstant Integer 3))) " - + "(SBlock (SReturn (EConstant Integer 5))))", - node.toString()); - } - - public void testSSubEachIterable() { - Location l = new Location(getTestName(), 0); - Variable v = new Variable(int.class, "test", false); - AExpression e = new EListInit(l, Arrays.asList(new EConstant(l, 1), new EConstant(l, 2), new EConstant(l, 3))); - SBlock b = new SBlock(l, singletonList(new SReturn(l, new EConstant(l, 5)))); - SSubEachIterable node = new SSubEachIterable(l, v, e, b); - assertEquals( - "(SSubEachIterable int test (EListInit (EConstant Integer 1) (EConstant Integer 2) (EConstant Integer 3)) (SBlock " - + "(SReturn (EConstant Integer 5))))", - node.toString()); - } - public void testSThrow() { assertToString("(SClass (SThrow (ENewObj RuntimeException)))", "throw new RuntimeException()"); } diff --git a/modules/mapper-extras/build.gradle b/modules/mapper-extras/build.gradle index d4f3b6871a631..4d2e4b7d70ffe 100644 --- a/modules/mapper-extras/build.gradle +++ b/modules/mapper-extras/build.gradle @@ -21,3 +21,9 @@ esplugin { description 'Adds advanced field mappers' classname 'org.elasticsearch.index.mapper.MapperExtrasPlugin' } + +restResources { + restApi { + includeCore '_common', 
'cluster', 'nodes', 'indices', 'index', 'search', 'get' + } +} diff --git a/modules/parent-join/build.gradle b/modules/parent-join/build.gradle index 756a65a371a9b..d6cb0c3c3a515 100644 --- a/modules/parent-join/build.gradle +++ b/modules/parent-join/build.gradle @@ -21,3 +21,9 @@ esplugin { description 'This module adds the support parent-child queries and aggregations' classname 'org.elasticsearch.join.ParentJoinPlugin' } + +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'indices', 'index', 'search' + } +} diff --git a/modules/percolator/build.gradle b/modules/percolator/build.gradle index b0cb4390b02bb..ae19823db1542 100644 --- a/modules/percolator/build.gradle +++ b/modules/percolator/build.gradle @@ -26,6 +26,11 @@ dependencies { testCompile project(path: ':modules:parent-join', configuration: 'runtime') } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search', 'msearch' + } +} dependencyLicenses { // Don't check the client's license. We know it. dependencies = project.configurations.runtime.fileCollection { diff --git a/modules/percolator/src/main/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhase.java b/modules/percolator/src/main/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhase.java index d13a54cba0467..cf76f531b6d66 100644 --- a/modules/percolator/src/main/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhase.java +++ b/modules/percolator/src/main/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhase.java @@ -21,16 +21,11 @@ import org.apache.lucene.index.LeafReaderContext; import org.apache.lucene.index.ReaderUtil; -import org.apache.lucene.search.BooleanClause; -import org.apache.lucene.search.BooleanQuery; -import org.apache.lucene.search.BoostQuery; -import org.apache.lucene.search.ConstantScoreQuery; -import org.apache.lucene.search.DisjunctionMaxQuery; import org.apache.lucene.search.IndexSearcher; import org.apache.lucene.search.Query; +import org.apache.lucene.search.QueryVisitor; import org.elasticsearch.common.bytes.BytesReference; import org.elasticsearch.common.document.DocumentField; -import org.elasticsearch.common.lucene.search.function.FunctionScoreQuery; import org.elasticsearch.index.query.QueryShardContext; import org.elasticsearch.search.SearchHit; import org.elasticsearch.search.fetch.FetchSubPhase; @@ -139,33 +134,18 @@ public void hitsExecute(SearchContext context, SearchHit[] hits) throws IOExcept } static List locatePercolatorQuery(Query query) { - if (query instanceof PercolateQuery) { - return Collections.singletonList((PercolateQuery) query); - } else if (query instanceof BooleanQuery) { - List percolateQueries = new ArrayList<>(); - for (BooleanClause clause : ((BooleanQuery) query).clauses()) { - List result = locatePercolatorQuery(clause.getQuery()); - if (result.isEmpty() == false) { - percolateQueries.addAll(result); - } - } - return percolateQueries; - } else if (query instanceof DisjunctionMaxQuery) { - List percolateQueries = new ArrayList<>(); - for (Query disjunct : ((DisjunctionMaxQuery) query).getDisjuncts()) { - List result = locatePercolatorQuery(disjunct); - if (result.isEmpty() == false) { - percolateQueries.addAll(result); + if (query == null) { + return Collections.emptyList(); + } + List queries = new ArrayList<>(); + query.visit(new QueryVisitor() { + @Override + public void visitLeaf(Query query) { + if (query instanceof PercolateQuery) { + queries.add((PercolateQuery)query); } } - return percolateQueries; - } else if 
(query instanceof ConstantScoreQuery) { - return locatePercolatorQuery(((ConstantScoreQuery) query).getQuery()); - } else if (query instanceof BoostQuery) { - return locatePercolatorQuery(((BoostQuery) query).getQuery()); - } else if (query instanceof FunctionScoreQuery) { - return locatePercolatorQuery(((FunctionScoreQuery) query).getSubQuery()); - } - return Collections.emptyList(); + }); + return queries; } } diff --git a/modules/percolator/src/test/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhaseTests.java b/modules/percolator/src/test/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhaseTests.java index 291a42c14665f..09dd2e67e9901 100644 --- a/modules/percolator/src/test/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhaseTests.java +++ b/modules/percolator/src/test/java/org/elasticsearch/percolator/PercolatorHighlightSubFetchPhaseTests.java @@ -37,6 +37,7 @@ import java.util.Collections; import static java.util.Collections.emptyMap; +import static org.hamcrest.Matchers.containsInAnyOrder; import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.is; import static org.hamcrest.Matchers.sameInstance; @@ -99,8 +100,11 @@ public void testLocatePercolatorQuery() { bq.add(percolateQuery, BooleanClause.Occur.FILTER); bq.add(percolateQuery2, BooleanClause.Occur.FILTER); assertThat(PercolatorHighlightSubFetchPhase.locatePercolatorQuery(bq.build()).size(), equalTo(2)); - assertThat(PercolatorHighlightSubFetchPhase.locatePercolatorQuery(bq.build()).get(0), sameInstance(percolateQuery)); - assertThat(PercolatorHighlightSubFetchPhase.locatePercolatorQuery(bq.build()).get(1), sameInstance(percolateQuery2)); + assertThat(PercolatorHighlightSubFetchPhase.locatePercolatorQuery(bq.build()), + containsInAnyOrder(sameInstance(percolateQuery), sameInstance(percolateQuery2))); + + assertNotNull(PercolatorHighlightSubFetchPhase.locatePercolatorQuery(null)); + assertThat(PercolatorHighlightSubFetchPhase.locatePercolatorQuery(null).size(), equalTo(0)); } } diff --git a/modules/rank-eval/build.gradle b/modules/rank-eval/build.gradle index 35f8fef5176a5..1adcd3016e50d 100644 --- a/modules/rank-eval/build.gradle +++ b/modules/rank-eval/build.gradle @@ -22,6 +22,12 @@ esplugin { classname 'org.elasticsearch.index.rankeval.RankEvalPlugin' } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'rank_eval' + } +} + testClusters.integTest { // Modules who's integration is explicitly tested in integration tests module file(project(':modules:lang-mustache').tasks.bundlePlugin.archiveFile) diff --git a/modules/reindex/build.gradle b/modules/reindex/build.gradle index e255c2567cc9a..a6b60166c343f 100644 --- a/modules/reindex/build.gradle +++ b/modules/reindex/build.gradle @@ -55,6 +55,13 @@ dependencies { testCompile project(path: ':modules:parent-join', configuration: 'runtime') } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'indices', 'index', 'get', 'search', 'mget', 'count', + 'update_by_query', 'delete_by_query', 'reindex_rethrottle', 'tasks', 'reindex', 'put_script' + } +} + thirdPartyAudit.ignoreMissingClasses( // Commons logging 'javax.servlet.ServletContextEvent', diff --git a/modules/repository-url/build.gradle b/modules/repository-url/build.gradle index 31fba2964f0c5..cec28b1d00fc8 100644 --- a/modules/repository-url/build.gradle +++ b/modules/repository-url/build.gradle @@ -26,7 +26,13 @@ esplugin { classname 'org.elasticsearch.plugin.repository.url.URLRepositoryPlugin' } -// This directory 
is shared between two URL repositories and one FS repository in YAML integration tests +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'indices', 'index', 'bulk', 'count', 'snapshot' + } +} + + // This directory is shared between two URL repositories and one FS repository in YAML integration tests File repositoryDir = new File(project.buildDir, "shared-repository") /** A task to start the URLFixture which exposes the repositoryDir over HTTP **/ diff --git a/modules/tasks/src/main/java/org/elasticsearch/tasksplugin/TasksPlugin.java b/modules/tasks/src/main/java/org/elasticsearch/tasksplugin/TasksPlugin.java index 0467b9419c778..b7d63991877db 100644 --- a/modules/tasks/src/main/java/org/elasticsearch/tasksplugin/TasksPlugin.java +++ b/modules/tasks/src/main/java/org/elasticsearch/tasksplugin/TasksPlugin.java @@ -19,7 +19,6 @@ package org.elasticsearch.tasksplugin; -import org.elasticsearch.common.settings.Settings; import org.elasticsearch.indices.SystemIndexDescriptor; import org.elasticsearch.plugins.Plugin; import org.elasticsearch.plugins.SystemIndexPlugin; @@ -35,7 +34,7 @@ public class TasksPlugin extends Plugin implements SystemIndexPlugin { @Override - public Collection getSystemIndexDescriptors(Settings settings) { + public Collection getSystemIndexDescriptors() { return Collections.singletonList(new SystemIndexDescriptor(TASK_INDEX, this.getClass().getSimpleName())); } } diff --git a/modules/tasks/src/test/java/org/elasticsearch/tasksplugin/TasksPluginTests.java b/modules/tasks/src/test/java/org/elasticsearch/tasksplugin/TasksPluginTests.java index 23b873e377eb3..48ec1e06098f3 100644 --- a/modules/tasks/src/test/java/org/elasticsearch/tasksplugin/TasksPluginTests.java +++ b/modules/tasks/src/test/java/org/elasticsearch/tasksplugin/TasksPluginTests.java @@ -19,7 +19,6 @@ package org.elasticsearch.tasksplugin; -import org.elasticsearch.common.settings.Settings; import org.elasticsearch.test.ESTestCase; import org.hamcrest.Matchers; @@ -28,6 +27,6 @@ public class TasksPluginTests extends ESTestCase { public void testDummy() { // This is a dummy test case to satisfy the conventions TasksPlugin plugin = new TasksPlugin(); - assertThat(plugin.getSystemIndexDescriptors(Settings.EMPTY), Matchers.hasSize(1)); + assertThat(plugin.getSystemIndexDescriptors(), Matchers.hasSize(1)); } } diff --git a/modules/transport-netty4/build.gradle b/modules/transport-netty4/build.gradle index 9db16d00e7d61..02386665abb41 100644 --- a/modules/transport-netty4/build.gradle +++ b/modules/transport-netty4/build.gradle @@ -43,6 +43,12 @@ dependencies { compile "io.netty:netty-transport:${versions.netty}" } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} + dependencyLicenses { mapping from: /netty-.*/, to: 'netty' } diff --git a/plugins/analysis-icu/build.gradle b/plugins/analysis-icu/build.gradle index 3fbf7f09b726e..3d1e2909c2291 100644 --- a/plugins/analysis-icu/build.gradle +++ b/plugins/analysis-icu/build.gradle @@ -35,6 +35,12 @@ dependencies { compile "com.ibm.icu:icu4j:${versions.icu4j}" } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} + dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' } diff --git a/plugins/analysis-icu/licenses/lucene-analyzers-icu-8.5.0-snapshot-7f057455901.jar.sha1 b/plugins/analysis-icu/licenses/lucene-analyzers-icu-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index f639c17a35f5a..0000000000000 --- 
a/plugins/analysis-icu/licenses/lucene-analyzers-icu-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -b6f880fa08a44fcb2d50808f9eeb6189a293ce27 \ No newline at end of file diff --git a/plugins/analysis-icu/licenses/lucene-analyzers-icu-8.5.0.jar.sha1 b/plugins/analysis-icu/licenses/lucene-analyzers-icu-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..c2f0eb182c26e --- /dev/null +++ b/plugins/analysis-icu/licenses/lucene-analyzers-icu-8.5.0.jar.sha1 @@ -0,0 +1 @@ +0697a7b06e4447be330f093e62d863deaadabc8c \ No newline at end of file diff --git a/plugins/analysis-kuromoji/build.gradle b/plugins/analysis-kuromoji/build.gradle index 333818a8d61b7..85c8b472aceea 100644 --- a/plugins/analysis-kuromoji/build.gradle +++ b/plugins/analysis-kuromoji/build.gradle @@ -26,6 +26,12 @@ dependencies { compile "org.apache.lucene:lucene-analyzers-kuromoji:${versions.lucene}" } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} + dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' } diff --git a/plugins/analysis-kuromoji/licenses/lucene-analyzers-kuromoji-8.5.0-snapshot-7f057455901.jar.sha1 b/plugins/analysis-kuromoji/licenses/lucene-analyzers-kuromoji-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 5c1e0b2f6fa43..0000000000000 --- a/plugins/analysis-kuromoji/licenses/lucene-analyzers-kuromoji-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -9c5b8619795f69c225b5ec37b87cb34de0feccd4 \ No newline at end of file diff --git a/plugins/analysis-kuromoji/licenses/lucene-analyzers-kuromoji-8.5.0.jar.sha1 b/plugins/analysis-kuromoji/licenses/lucene-analyzers-kuromoji-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..9f20b12df3341 --- /dev/null +++ b/plugins/analysis-kuromoji/licenses/lucene-analyzers-kuromoji-8.5.0.jar.sha1 @@ -0,0 +1 @@ +b269efbdd16c28525942a27f5738c9d1348b6301 \ No newline at end of file diff --git a/plugins/analysis-nori/build.gradle b/plugins/analysis-nori/build.gradle index a9d3a1126dc58..c732ec9037db9 100644 --- a/plugins/analysis-nori/build.gradle +++ b/plugins/analysis-nori/build.gradle @@ -26,6 +26,11 @@ dependencies { compile "org.apache.lucene:lucene-analyzers-nori:${versions.lucene}" } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' } diff --git a/plugins/analysis-nori/licenses/lucene-analyzers-nori-8.5.0-snapshot-7f057455901.jar.sha1 b/plugins/analysis-nori/licenses/lucene-analyzers-nori-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 656f63c68d194..0000000000000 --- a/plugins/analysis-nori/licenses/lucene-analyzers-nori-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -421e13b9fe09523e094ac708204d62d4ea5b6618 \ No newline at end of file diff --git a/plugins/analysis-nori/licenses/lucene-analyzers-nori-8.5.0.jar.sha1 b/plugins/analysis-nori/licenses/lucene-analyzers-nori-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..4ea0c218177ec --- /dev/null +++ b/plugins/analysis-nori/licenses/lucene-analyzers-nori-8.5.0.jar.sha1 @@ -0,0 +1 @@ +056eef8f0a64a70cd9af070ecd8e17d33e55cb75 \ No newline at end of file diff --git a/plugins/analysis-nori/src/main/java/org/elasticsearch/index/analysis/NoriNumberFilterFactory.java b/plugins/analysis-nori/src/main/java/org/elasticsearch/index/analysis/NoriNumberFilterFactory.java new file mode 100644 index 0000000000000..54a5ab9c2124e --- /dev/null +++ 
b/plugins/analysis-nori/src/main/java/org/elasticsearch/index/analysis/NoriNumberFilterFactory.java @@ -0,0 +1,38 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.index.analysis; + +import org.apache.lucene.analysis.TokenStream; +import org.apache.lucene.analysis.ko.KoreanNumberFilter; +import org.elasticsearch.common.settings.Settings; +import org.elasticsearch.env.Environment; +import org.elasticsearch.index.IndexSettings; + +public class NoriNumberFilterFactory extends AbstractTokenFilterFactory { + + public NoriNumberFilterFactory(IndexSettings indexSettings, Environment environment, String name, Settings settings) { + super(indexSettings, name, settings); + } + + @Override + public TokenStream create(TokenStream tokenStream) { + return new KoreanNumberFilter(tokenStream); + } +} diff --git a/plugins/analysis-nori/src/main/java/org/elasticsearch/index/analysis/NoriTokenizerFactory.java b/plugins/analysis-nori/src/main/java/org/elasticsearch/index/analysis/NoriTokenizerFactory.java index bac5dd2a77065..9680d6fd5f80a 100644 --- a/plugins/analysis-nori/src/main/java/org/elasticsearch/index/analysis/NoriTokenizerFactory.java +++ b/plugins/analysis-nori/src/main/java/org/elasticsearch/index/analysis/NoriTokenizerFactory.java @@ -39,11 +39,13 @@ public class NoriTokenizerFactory extends AbstractTokenizerFactory { private final UserDictionary userDictionary; private final KoreanTokenizer.DecompoundMode decompoundMode; + private final boolean discardPunctuation; public NoriTokenizerFactory(IndexSettings indexSettings, Environment env, String name, Settings settings) { super(indexSettings, settings, name); decompoundMode = getMode(settings); userDictionary = getUserDictionary(env, settings); + discardPunctuation = settings.getAsBoolean("discard_punctuation", true); } public static UserDictionary getUserDictionary(Environment env, Settings settings) { @@ -77,7 +79,8 @@ public static KoreanTokenizer.DecompoundMode getMode(Settings settings) { @Override public Tokenizer create() { - return new KoreanTokenizer(KoreanTokenizer.DEFAULT_TOKEN_ATTRIBUTE_FACTORY, userDictionary, decompoundMode, false); + return new KoreanTokenizer(KoreanTokenizer.DEFAULT_TOKEN_ATTRIBUTE_FACTORY, userDictionary, decompoundMode, false, + discardPunctuation); } } diff --git a/plugins/analysis-nori/src/main/java/org/elasticsearch/plugin/analysis/nori/AnalysisNoriPlugin.java b/plugins/analysis-nori/src/main/java/org/elasticsearch/plugin/analysis/nori/AnalysisNoriPlugin.java index 6e9baa7acd26c..72097e2e83472 100644 --- a/plugins/analysis-nori/src/main/java/org/elasticsearch/plugin/analysis/nori/AnalysisNoriPlugin.java +++ b/plugins/analysis-nori/src/main/java/org/elasticsearch/plugin/analysis/nori/AnalysisNoriPlugin.java @@ -22,6 +22,7 @@ import 
org.apache.lucene.analysis.Analyzer; import org.elasticsearch.index.analysis.AnalyzerProvider; import org.elasticsearch.index.analysis.NoriAnalyzerProvider; +import org.elasticsearch.index.analysis.NoriNumberFilterFactory; import org.elasticsearch.index.analysis.NoriPartOfSpeechStopFilterFactory; import org.elasticsearch.index.analysis.NoriReadingFormFilterFactory; import org.elasticsearch.index.analysis.NoriTokenizerFactory; @@ -42,6 +43,7 @@ public Map<String, AnalysisProvider<TokenFilterFactory>> getTokenFilters() { Map<String, AnalysisProvider<TokenFilterFactory>> extra = new HashMap<>(); extra.put("nori_part_of_speech", NoriPartOfSpeechStopFilterFactory::new); extra.put("nori_readingform", NoriReadingFormFilterFactory::new); + extra.put("nori_number", NoriNumberFilterFactory::new); return extra; } diff --git a/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/AnalysisNoriFactoryTests.java b/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/AnalysisNoriFactoryTests.java index 1677ba94b8783..de70e26fc6f01 100644 --- a/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/AnalysisNoriFactoryTests.java +++ b/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/AnalysisNoriFactoryTests.java @@ -43,6 +43,7 @@ protected Map<String, Class<?>> getTokenFilters() { Map<String, Class<?>> filters = new HashMap<>(super.getTokenFilters()); filters.put("koreanpartofspeechstop", NoriPartOfSpeechStopFilterFactory.class); filters.put("koreanreadingform", NoriReadingFormFilterFactory.class); + filters.put("koreannumber", NoriNumberFilterFactory.class); return filters; } } diff --git a/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/NoriAnalysisTests.java b/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/NoriAnalysisTests.java index 051a2f3e4dc32..87c78c7f981b9 100644 --- a/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/NoriAnalysisTests.java +++ b/plugins/analysis-nori/src/test/java/org/elasticsearch/index/analysis/NoriAnalysisTests.java @@ -54,6 +54,9 @@ public void testDefaultsNoriAnalysis() throws IOException { filterFactory = analysis.tokenFilter.get("nori_readingform"); assertThat(filterFactory, instanceOf(NoriReadingFormFilterFactory.class)); + filterFactory = analysis.tokenFilter.get("nori_number"); + assertThat(filterFactory, instanceOf(NoriNumberFilterFactory.class)); + IndexAnalyzers indexAnalyzers = analysis.indexAnalyzers; NamedAnalyzer analyzer = indexAnalyzers.get("nori"); assertThat(analyzer.analyzer(), instanceOf(KoreanAnalyzer.class)); @@ -130,6 +133,33 @@ public void testNoriTokenizer() throws Exception { assertTokenStreamContents(tokenizer, new String[] {"뿌리", "가", "깊", "은", "나무"}); tokenizer.setReader(new StringReader("가늠표")); assertTokenStreamContents(tokenizer, new String[] {"가늠표", "가늠", "표"}); + // discard_punctuation default(true) + tokenizer.setReader(new StringReader("3.2개")); + assertTokenStreamContents(tokenizer, new String[] {"3", "2", "개"}); + } + + public void testNoriTokenizerDiscardPunctuationOptionTrue() throws Exception { + Settings settings = createDiscardPunctuationOption("true"); + TestAnalysis analysis = createTestAnalysis(settings); + Tokenizer tokenizer = 
analysis.tokenizer.get("my_tokenizer").create(); + tokenizer.setReader(new StringReader("3.2개")); + assertTokenStreamContents(tokenizer, new String[] {"3", ".", "2", "개"}); + } + + public void testNoriTokenizerInvalidDiscardPunctuationOption() { + String wrongOption = "wrong"; + Settings settings = createDiscardPunctuationOption(wrongOption); + IllegalArgumentException exc = expectThrows(IllegalArgumentException.class, () -> createTestAnalysis(settings)); + assertThat(exc.getMessage(), containsString("Failed to parse value [" + wrongOption + + "] as only [true] or [false] are allowed.")); } public void testNoriPartOfSpeech() throws IOException { @@ -159,6 +189,27 @@ public void testNoriReadingForm() throws IOException { assertTokenStreamContents(stream, new String[] {"향가"}); } + public void testNoriNumber() throws IOException { + Settings settings = Settings.builder() + .put(IndexMetaData.SETTING_VERSION_CREATED, Version.CURRENT) + .put(Environment.PATH_HOME_SETTING.getKey(), createTempDir().toString()) + .put("index.analysis.filter.my_filter.type", "nori_number") + .build(); + TestAnalysis analysis = AnalysisTestsHelper.createTestAnalysisFromSettings(settings, new AnalysisNoriPlugin()); + TokenFilterFactory factory = analysis.tokenFilter.get("my_filter"); + Tokenizer tokenizer = new KoreanTokenizer(); + tokenizer.setReader(new StringReader("오늘 십만이천오백원짜리 와인 구입")); + TokenStream stream = factory.create(tokenizer); + assertTokenStreamContents(stream, new String[] {"오늘", "102500", "원", "짜리", "와인", "구입"}); + } + + private Settings createDiscardPunctuationOption(String option) { + return Settings.builder() + .put("index.analysis.tokenizer.my_tokenizer.type", "nori_tokenizer") + .put("index.analysis.tokenizer.my_tokenizer.discard_punctuation", option) + .build(); + } + private TestAnalysis createTestAnalysis(Settings analysisSettings) throws IOException { InputStream dict = NoriAnalysisTests.class.getResourceAsStream("user_dict.txt"); Path home = createTempDir(); diff --git a/plugins/analysis-nori/src/test/resources/rest-api-spec/test/analysis_nori/10_basic.yml b/plugins/analysis-nori/src/test/resources/rest-api-spec/test/analysis_nori/10_basic.yml index a5aa9998da6ba..523874f5743bb 100644 --- a/plugins/analysis-nori/src/test/resources/rest-api-spec/test/analysis_nori/10_basic.yml +++ b/plugins/analysis-nori/src/test/resources/rest-api-spec/test/analysis_nori/10_basic.yml @@ -46,3 +46,20 @@ filter: [nori_readingform] - length: { tokens: 1 } - match: { tokens.0.token: 향가 } +--- +"Number filter": + - do: + indices.analyze: + body: + text: 십만이천오백과 3.2천 + tokenizer: + type: nori_tokenizer + discard_punctuation: false + filter: + - type: nori_part_of_speech + stoptags: ["SP"] + - type: nori_number + - length: { tokens: 3 } + - match: { tokens.0.token: "102500"} + - match: { tokens.1.token: 과} + - match: { tokens.2.token: "3200"} diff --git a/plugins/analysis-phonetic/build.gradle b/plugins/analysis-phonetic/build.gradle index 61c4fdbd58394..82091b91d9ead 100644 --- a/plugins/analysis-phonetic/build.gradle +++ b/plugins/analysis-phonetic/build.gradle @@ -27,6 +27,12 @@ dependencies { compile "commons-codec:commons-codec:${versions.commonscodec}" } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} + dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' } diff --git a/plugins/analysis-phonetic/licenses/lucene-analyzers-phonetic-8.5.0-snapshot-7f057455901.jar.sha1 
b/plugins/analysis-phonetic/licenses/lucene-analyzers-phonetic-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 951d97032f9fb..0000000000000 --- a/plugins/analysis-phonetic/licenses/lucene-analyzers-phonetic-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -ff4ae9f3f3b0bc497f98c9bc47e943525669fc99 \ No newline at end of file diff --git a/plugins/analysis-phonetic/licenses/lucene-analyzers-phonetic-8.5.0.jar.sha1 b/plugins/analysis-phonetic/licenses/lucene-analyzers-phonetic-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..b0e0f037d5571 --- /dev/null +++ b/plugins/analysis-phonetic/licenses/lucene-analyzers-phonetic-8.5.0.jar.sha1 @@ -0,0 +1 @@ +356e39f7b4b0cf8bc6a766a54ab6b93b11c89f6d \ No newline at end of file diff --git a/plugins/analysis-smartcn/build.gradle b/plugins/analysis-smartcn/build.gradle index ebe44850d00a7..a6d287ef690de 100644 --- a/plugins/analysis-smartcn/build.gradle +++ b/plugins/analysis-smartcn/build.gradle @@ -26,6 +26,12 @@ dependencies { compile "org.apache.lucene:lucene-analyzers-smartcn:${versions.lucene}" } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} + dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' } diff --git a/plugins/analysis-smartcn/licenses/lucene-analyzers-smartcn-8.5.0-snapshot-7f057455901.jar.sha1 b/plugins/analysis-smartcn/licenses/lucene-analyzers-smartcn-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 19846e43636c3..0000000000000 --- a/plugins/analysis-smartcn/licenses/lucene-analyzers-smartcn-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -dd6430c037566cd3852b73b2ec31e59de24cfe58 \ No newline at end of file diff --git a/plugins/analysis-smartcn/licenses/lucene-analyzers-smartcn-8.5.0.jar.sha1 b/plugins/analysis-smartcn/licenses/lucene-analyzers-smartcn-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..faa11d7a87fb3 --- /dev/null +++ b/plugins/analysis-smartcn/licenses/lucene-analyzers-smartcn-8.5.0.jar.sha1 @@ -0,0 +1 @@ +6139d1cc3f51d6f5fc5ab9976dc5376b682d8332 \ No newline at end of file diff --git a/plugins/analysis-stempel/build.gradle b/plugins/analysis-stempel/build.gradle index 488e99ec912bd..7f90e10817364 100644 --- a/plugins/analysis-stempel/build.gradle +++ b/plugins/analysis-stempel/build.gradle @@ -26,6 +26,12 @@ dependencies { compile "org.apache.lucene:lucene-analyzers-stempel:${versions.lucene}" } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} + dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' } diff --git a/plugins/analysis-stempel/licenses/lucene-analyzers-stempel-8.5.0-snapshot-7f057455901.jar.sha1 b/plugins/analysis-stempel/licenses/lucene-analyzers-stempel-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 147bbb3192d24..0000000000000 --- a/plugins/analysis-stempel/licenses/lucene-analyzers-stempel-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -dd4ca22b151a98a21e255bc1c54f0fadfee5ca4d \ No newline at end of file diff --git a/plugins/analysis-stempel/licenses/lucene-analyzers-stempel-8.5.0.jar.sha1 b/plugins/analysis-stempel/licenses/lucene-analyzers-stempel-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..62fe533fde854 --- /dev/null +++ b/plugins/analysis-stempel/licenses/lucene-analyzers-stempel-8.5.0.jar.sha1 @@ -0,0 +1 @@ +f6c1757e23f0cbad2e4d2eb39a12d9deff2e802b \ No newline at end of file diff --git a/plugins/analysis-ukrainian/build.gradle b/plugins/analysis-ukrainian/build.gradle index 
0e254e5f05026..cec538d7d79a8 100644 --- a/plugins/analysis-ukrainian/build.gradle +++ b/plugins/analysis-ukrainian/build.gradle @@ -29,6 +29,12 @@ dependencies { compile "ua.net.nlp:morfologik-ukrainian-search:3.7.5" } +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} + dependencyLicenses { mapping from: /lucene-.*/, to: 'lucene' mapping from: /morfologik-.*/, to: 'lucene' diff --git a/plugins/analysis-ukrainian/licenses/lucene-analyzers-morfologik-8.5.0-snapshot-7f057455901.jar.sha1 b/plugins/analysis-ukrainian/licenses/lucene-analyzers-morfologik-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 86cee0ecdd039..0000000000000 --- a/plugins/analysis-ukrainian/licenses/lucene-analyzers-morfologik-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -e85f94d2747ddb560af0bc4d15f0cde45cf3ff30 \ No newline at end of file diff --git a/plugins/analysis-ukrainian/licenses/lucene-analyzers-morfologik-8.5.0.jar.sha1 b/plugins/analysis-ukrainian/licenses/lucene-analyzers-morfologik-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..52115dc4ff819 --- /dev/null +++ b/plugins/analysis-ukrainian/licenses/lucene-analyzers-morfologik-8.5.0.jar.sha1 @@ -0,0 +1 @@ +e647c3158a092df07ba4ac1b827623f45176ef48 \ No newline at end of file diff --git a/plugins/discovery-azure-classic/build.gradle b/plugins/discovery-azure-classic/build.gradle index 257777526c92b..bbad69232d0fa 100644 --- a/plugins/discovery-azure-classic/build.gradle +++ b/plugins/discovery-azure-classic/build.gradle @@ -57,6 +57,11 @@ dependencies { compile 'javax.xml.bind:jaxb-api:2.2.2' } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} // needed to be consistent with ssl host checking String host = InetAddress.getLoopbackAddress().getHostAddress() diff --git a/plugins/discovery-ec2/build.gradle b/plugins/discovery-ec2/build.gradle index a9ae35393de16..c45d0f9f457c0 100644 --- a/plugins/discovery-ec2/build.gradle +++ b/plugins/discovery-ec2/build.gradle @@ -25,7 +25,7 @@ esplugin { } versions << [ - 'aws': '1.11.636' + 'aws': '1.11.749' ] dependencies { @@ -36,10 +36,16 @@ dependencies { compile "commons-logging:commons-logging:${versions.commonslogging}" compile "org.apache.logging.log4j:log4j-1.2-api:${versions.log4j}" compile "commons-codec:commons-codec:${versions.commonscodec}" - compile "com.fasterxml.jackson.core:jackson-databind:${versions.jacksondatabind}" + compile "com.fasterxml.jackson.core:jackson-databind:${versions.jackson}" compile "com.fasterxml.jackson.core:jackson-annotations:${versions.jackson}" } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} + dependencyLicenses { mapping from: /aws-java-sdk-.*/, to: 'aws-java-sdk' mapping from: /jackson-.*/, to: 'jackson' diff --git a/plugins/discovery-ec2/licenses/aws-java-sdk-core-1.11.636.jar.sha1 b/plugins/discovery-ec2/licenses/aws-java-sdk-core-1.11.636.jar.sha1 deleted file mode 100644 index b9ee9c102dbcb..0000000000000 --- a/plugins/discovery-ec2/licenses/aws-java-sdk-core-1.11.636.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -84c9f180f8f60f6f1433c9c5253fcb704593b121 \ No newline at end of file diff --git a/plugins/discovery-ec2/licenses/aws-java-sdk-core-1.11.749.jar.sha1 b/plugins/discovery-ec2/licenses/aws-java-sdk-core-1.11.749.jar.sha1 new file mode 100644 index 0000000000000..7bc18d6d4f681 --- /dev/null +++ b/plugins/discovery-ec2/licenses/aws-java-sdk-core-1.11.749.jar.sha1 @@ -0,0 +1 @@ +1da5c1549295cfeebc67fc1c7539785a9441755b \ No newline at end 
of file diff --git a/plugins/discovery-ec2/licenses/aws-java-sdk-ec2-1.11.636.jar.sha1 b/plugins/discovery-ec2/licenses/aws-java-sdk-ec2-1.11.636.jar.sha1 deleted file mode 100644 index ed737c808c1de..0000000000000 --- a/plugins/discovery-ec2/licenses/aws-java-sdk-ec2-1.11.636.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -d32fc4ae314dbee9717302a3119cba0f735c04b1 \ No newline at end of file diff --git a/plugins/discovery-ec2/licenses/aws-java-sdk-ec2-1.11.749.jar.sha1 b/plugins/discovery-ec2/licenses/aws-java-sdk-ec2-1.11.749.jar.sha1 new file mode 100644 index 0000000000000..c7c7220005fc3 --- /dev/null +++ b/plugins/discovery-ec2/licenses/aws-java-sdk-ec2-1.11.749.jar.sha1 @@ -0,0 +1 @@ +0865e0937c6500acf62ce9c8964eac76a8718f5f \ No newline at end of file diff --git a/plugins/discovery-ec2/licenses/jackson-annotations-2.10.3.jar.sha1 b/plugins/discovery-ec2/licenses/jackson-annotations-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..9c725f2d90e69 --- /dev/null +++ b/plugins/discovery-ec2/licenses/jackson-annotations-2.10.3.jar.sha1 @@ -0,0 +1 @@ +0f63b3b1da563767d04d2e4d3fc1ae0cdeffebe7 \ No newline at end of file diff --git a/plugins/discovery-ec2/licenses/jackson-annotations-2.8.11.jar.sha1 b/plugins/discovery-ec2/licenses/jackson-annotations-2.8.11.jar.sha1 deleted file mode 100644 index 30e7d1a7b1a74..0000000000000 --- a/plugins/discovery-ec2/licenses/jackson-annotations-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -391de20b4e29cb3fb07d2454ace64be2c82ac91f \ No newline at end of file diff --git a/plugins/discovery-ec2/licenses/jackson-databind-2.10.3.jar.sha1 b/plugins/discovery-ec2/licenses/jackson-databind-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..688ae92d10792 --- /dev/null +++ b/plugins/discovery-ec2/licenses/jackson-databind-2.10.3.jar.sha1 @@ -0,0 +1 @@ +aae92628b5447fa25af79871ca98668da6edd439 \ No newline at end of file diff --git a/plugins/discovery-ec2/licenses/jackson-databind-2.8.11.6.jar.sha1 b/plugins/discovery-ec2/licenses/jackson-databind-2.8.11.6.jar.sha1 deleted file mode 100644 index f491259db56bc..0000000000000 --- a/plugins/discovery-ec2/licenses/jackson-databind-2.8.11.6.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -35753201d0cdb1dbe998ab289bca1180b68d4368 \ No newline at end of file diff --git a/plugins/discovery-ec2/qa/amazon-ec2/build.gradle b/plugins/discovery-ec2/qa/amazon-ec2/build.gradle index 023c864d62e7c..a10d9feccfdce 100644 --- a/plugins/discovery-ec2/qa/amazon-ec2/build.gradle +++ b/plugins/discovery-ec2/qa/amazon-ec2/build.gradle @@ -32,6 +32,12 @@ dependencies { testCompile project(path: ':plugins:discovery-ec2', configuration: 'runtime') } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} + final int ec2NumberOfNodes = 3 Map expansions = [ diff --git a/plugins/discovery-ec2/qa/amazon-ec2/src/test/java/org/elasticsearch/discovery/ec2/AmazonEC2Fixture.java b/plugins/discovery-ec2/qa/amazon-ec2/src/test/java/org/elasticsearch/discovery/ec2/AmazonEC2Fixture.java index 32abcdc43e645..ecaaceea5cc26 100644 --- a/plugins/discovery-ec2/qa/amazon-ec2/src/test/java/org/elasticsearch/discovery/ec2/AmazonEC2Fixture.java +++ b/plugins/discovery-ec2/qa/amazon-ec2/src/test/java/org/elasticsearch/discovery/ec2/AmazonEC2Fixture.java @@ -22,6 +22,7 @@ import org.apache.http.NameValuePair; import org.apache.http.client.methods.HttpGet; import org.apache.http.client.methods.HttpPost; +import org.apache.http.client.methods.HttpPut; import org.apache.http.client.utils.URLEncodedUtils; import org.elasticsearch.common.Booleans; import 
org.elasticsearch.common.SuppressForbidden; @@ -106,6 +107,13 @@ protected Response handle(final Request request) throws IOException { return new Response(RestStatus.OK.getStatus(), headers, "my_iam_profile".getBytes(UTF_8)); } + if (instanceProfile && "/latest/api/token".equals(request.getPath()) + && HttpPut.METHOD_NAME.equals(request.getMethod())) { + // TODO: Implement IMDSv2 behavior here. For now this just returns a 403 which makes the SDK fall back to IMDSv1 + // which is implemented in this fixture + return new Response(RestStatus.FORBIDDEN.getStatus(), TEXT_PLAIN_CONTENT_TYPE, EMPTY_BYTE); + } + if ((containerCredentials && "/ecs_credentials_endpoint".equals(request.getPath()) && HttpGet.METHOD_NAME.equals(request.getMethod())) || diff --git a/plugins/discovery-ec2/src/main/java/org/elasticsearch/discovery/ec2/AwsEc2ServiceImpl.java b/plugins/discovery-ec2/src/main/java/org/elasticsearch/discovery/ec2/AwsEc2ServiceImpl.java index ac18775fd1592..1d6f69927ab6d 100644 --- a/plugins/discovery-ec2/src/main/java/org/elasticsearch/discovery/ec2/AwsEc2ServiceImpl.java +++ b/plugins/discovery-ec2/src/main/java/org/elasticsearch/discovery/ec2/AwsEc2ServiceImpl.java @@ -24,9 +24,10 @@ import com.amazonaws.auth.AWSCredentialsProvider; import com.amazonaws.auth.AWSStaticCredentialsProvider; import com.amazonaws.auth.DefaultAWSCredentialsProviderChain; +import com.amazonaws.client.builder.AwsClientBuilder; import com.amazonaws.http.IdleConnectionReaper; import com.amazonaws.services.ec2.AmazonEC2; -import com.amazonaws.services.ec2.AmazonEC2Client; +import com.amazonaws.services.ec2.AmazonEC2ClientBuilder; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.elasticsearch.ElasticsearchException; @@ -45,17 +46,18 @@ class AwsEc2ServiceImpl implements AwsEc2Service { private AmazonEC2 buildClient(Ec2ClientSettings clientSettings) { final AWSCredentialsProvider credentials = buildCredentials(logger, clientSettings); final ClientConfiguration configuration = buildConfiguration(clientSettings); - final AmazonEC2 client = buildClient(credentials, configuration); - if (Strings.hasText(clientSettings.endpoint)) { - logger.debug("using explicit ec2 endpoint [{}]", clientSettings.endpoint); - client.setEndpoint(clientSettings.endpoint); - } - return client; + return buildClient(credentials, configuration, clientSettings.endpoint); } // proxy for testing - AmazonEC2 buildClient(AWSCredentialsProvider credentials, ClientConfiguration configuration) { - return new AmazonEC2Client(credentials, configuration); + AmazonEC2 buildClient(AWSCredentialsProvider credentials, ClientConfiguration configuration, String endpoint) { + final AmazonEC2ClientBuilder builder = AmazonEC2ClientBuilder.standard().withCredentials(credentials) + .withClientConfiguration(configuration); + if (Strings.hasText(endpoint)) { + logger.debug("using explicit ec2 endpoint [{}]", endpoint); + builder.withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(endpoint, null)); + } + return SocketAccess.doPrivileged(builder::build); } // pkg private for tests diff --git a/plugins/discovery-ec2/src/test/java/org/elasticsearch/discovery/ec2/Ec2DiscoveryPluginTests.java b/plugins/discovery-ec2/src/test/java/org/elasticsearch/discovery/ec2/Ec2DiscoveryPluginTests.java index 661b5815b4029..2e53a5f614101 100644 --- a/plugins/discovery-ec2/src/test/java/org/elasticsearch/discovery/ec2/Ec2DiscoveryPluginTests.java +++ 
b/plugins/discovery-ec2/src/test/java/org/elasticsearch/discovery/ec2/Ec2DiscoveryPluginTests.java @@ -39,7 +39,6 @@ import static org.hamcrest.Matchers.instanceOf; import static org.hamcrest.Matchers.is; -import static org.hamcrest.Matchers.nullValue; public class Ec2DiscoveryPluginTests extends ESTestCase { @@ -96,7 +95,7 @@ public void testNodeAttributesErrorLenient() throws Exception { public void testDefaultEndpoint() throws IOException { try (Ec2DiscoveryPluginMock plugin = new Ec2DiscoveryPluginMock(Settings.EMPTY)) { final String endpoint = ((AmazonEC2Mock) plugin.ec2Service.client().client()).endpoint; - assertThat(endpoint, nullValue()); + assertThat(endpoint, is("")); } } @@ -199,8 +198,9 @@ private static class Ec2DiscoveryPluginMock extends Ec2DiscoveryPlugin { Ec2DiscoveryPluginMock(Settings settings) { super(settings, new AwsEc2ServiceImpl() { @Override - AmazonEC2 buildClient(AWSCredentialsProvider credentials, ClientConfiguration configuration) { - return new AmazonEC2Mock(credentials, configuration); + AmazonEC2 buildClient(AWSCredentialsProvider credentials, ClientConfiguration configuration, + String endpoint) { + return new AmazonEC2Mock(credentials, configuration, endpoint); } }); } @@ -212,13 +212,9 @@ private static class AmazonEC2Mock extends AbstractAmazonEC2 { final AWSCredentialsProvider credentials; final ClientConfiguration configuration; - AmazonEC2Mock(AWSCredentialsProvider credentials, ClientConfiguration configuration) { + AmazonEC2Mock(AWSCredentialsProvider credentials, ClientConfiguration configuration, String endpoint) { this.credentials = credentials; this.configuration = configuration; - } - - @Override - public void setEndpoint(String endpoint) throws IllegalArgumentException { this.endpoint = endpoint; } diff --git a/plugins/discovery-gce/build.gradle b/plugins/discovery-gce/build.gradle index 7e5c29ff5c33c..275eb9312a30b 100644 --- a/plugins/discovery-gce/build.gradle +++ b/plugins/discovery-gce/build.gradle @@ -21,6 +21,12 @@ dependencies { compile "commons-codec:commons-codec:${versions.commonscodec}" } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} + dependencyLicenses { mapping from: /google-.*/, to: 'google' } diff --git a/plugins/discovery-gce/qa/gce/build.gradle b/plugins/discovery-gce/qa/gce/build.gradle index 3a5bf84c7ba93..7eb2fe13aae8f 100644 --- a/plugins/discovery-gce/qa/gce/build.gradle +++ b/plugins/discovery-gce/qa/gce/build.gradle @@ -33,6 +33,12 @@ dependencies { testCompile project(path: ':plugins:discovery-gce', configuration: 'runtime') } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} + /** A task to start the GCEFixture which emulates a GCE service **/ task gceFixture(type: AntFixture) { dependsOn compileTestJava diff --git a/plugins/ingest-attachment/build.gradle b/plugins/ingest-attachment/build.gradle index 0c5d5deaf3f04..668c4c397e814 100644 --- a/plugins/ingest-attachment/build.gradle +++ b/plugins/ingest-attachment/build.gradle @@ -72,6 +72,12 @@ dependencies { compile 'org.apache.commons:commons-lang3:3.9' } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'ingest', 'index', 'get' + } +} + dependencyLicenses { mapping from: /apache-mime4j-.*/, to: 'apache-mime4j' } diff --git a/plugins/mapper-annotated-text/build.gradle b/plugins/mapper-annotated-text/build.gradle index 8ce1ca2a416fe..2ae9a9278b0a5 100644 --- a/plugins/mapper-annotated-text/build.gradle +++ b/plugins/mapper-annotated-text/build.gradle @@ -21,3 +21,9 @@ esplugin 
{ description 'The Mapper Annotated_text plugin adds support for text fields with markup used to inject annotation tokens into the index.' classname 'org.elasticsearch.plugin.mapper.AnnotatedTextPlugin' } + +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} diff --git a/plugins/mapper-murmur3/build.gradle b/plugins/mapper-murmur3/build.gradle index 5b985d9138f52..f556e82db3867 100644 --- a/plugins/mapper-murmur3/build.gradle +++ b/plugins/mapper-murmur3/build.gradle @@ -21,3 +21,9 @@ esplugin { description 'The Mapper Murmur3 plugin allows to compute hashes of a field\'s values at index-time and to store them in the index.' classname 'org.elasticsearch.plugin.mapper.MapperMurmur3Plugin' } + +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'search' + } +} diff --git a/plugins/mapper-size/build.gradle b/plugins/mapper-size/build.gradle index 5a49ce5d04b78..0c757b4966bd3 100644 --- a/plugins/mapper-size/build.gradle +++ b/plugins/mapper-size/build.gradle @@ -21,3 +21,9 @@ esplugin { description 'The Mapper Size plugin allows document to record their uncompressed size at index time.' classname 'org.elasticsearch.plugin.mapper.MapperSizePlugin' } + +restResources { + restApi { + includeCore '_common', 'indices', 'index', 'get' + } +} diff --git a/plugins/repository-azure/build.gradle b/plugins/repository-azure/build.gradle index 5a11f8b02bf84..418eccc87ecfa 100644 --- a/plugins/repository-azure/build.gradle +++ b/plugins/repository-azure/build.gradle @@ -25,13 +25,19 @@ esplugin { } dependencies { - compile 'com.microsoft.azure:azure-storage:8.4.0' + compile 'com.microsoft.azure:azure-storage:8.6.2' compile 'com.microsoft.azure:azure-keyvault-core:1.0.0' compile 'com.google.guava:guava:20.0' compile 'org.apache.commons:commons-lang3:3.4' testCompile project(':test:fixtures:azure-fixture') } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} + dependencyLicenses { mapping from: /azure-.*/, to: 'azure' mapping from: /jackson-.*/, to: 'jackson' diff --git a/plugins/repository-azure/licenses/azure-storage-8.4.0.jar.sha1 b/plugins/repository-azure/licenses/azure-storage-8.4.0.jar.sha1 deleted file mode 100644 index db3b2baba0644..0000000000000 --- a/plugins/repository-azure/licenses/azure-storage-8.4.0.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -002c6b7827f06869b8d04880bf913ce4efcc9ad4 \ No newline at end of file diff --git a/plugins/repository-azure/licenses/azure-storage-8.6.2.jar.sha1 b/plugins/repository-azure/licenses/azure-storage-8.6.2.jar.sha1 new file mode 100644 index 0000000000000..0f8f24231fbdb --- /dev/null +++ b/plugins/repository-azure/licenses/azure-storage-8.6.2.jar.sha1 @@ -0,0 +1 @@ +d1b6de7264205e2441c667dfee5b002bbac61644 \ No newline at end of file diff --git a/plugins/repository-azure/qa/microsoft-azure-storage/build.gradle b/plugins/repository-azure/qa/microsoft-azure-storage/build.gradle index 529ee899fc8c4..3e5cd7758d22c 100644 --- a/plugins/repository-azure/qa/microsoft-azure-storage/build.gradle +++ b/plugins/repository-azure/qa/microsoft-azure-storage/build.gradle @@ -29,6 +29,12 @@ apply plugin: 'elasticsearch.standalone-rest-test' apply plugin: 'elasticsearch.rest-test' apply plugin: 'elasticsearch.test.fixtures' +restResources { + restApi { + includeCore '_common', 'snapshot', 'bulk', 'count', 'indices' + } +} + testFixtures.useFixture ":test:fixtures:azure-fixture", "azure-fixture" boolean useFixture = false @@ -82,7 +88,5 @@ testClusters.integTest { // in a hacky way to change the 
protocol and endpoint. We must fix that. setting 'azure.client.integration_test.endpoint_suffix', { "ignored;DefaultEndpointsProtocol=http;BlobEndpoint=${-> azureAddress()}" }, IGNORE_VALUE - String firstPartOfSeed = BuildParams.testSeed.tokenize(':').get(0) - setting 'thread_pool.repository_azure.max', (Math.abs(Long.parseUnsignedLong(firstPartOfSeed, 16) % 10) + 1).toString(), System.getProperty('ignore.tests.seed') == null ? DEFAULT : IGNORE_VALUE } } diff --git a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobContainer.java b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobContainer.java index 2093139e115a3..6bd480f7923a9 100644 --- a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobContainer.java +++ b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobContainer.java @@ -23,17 +23,12 @@ import com.microsoft.azure.storage.StorageException; import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; -import org.elasticsearch.action.ActionListener; -import org.elasticsearch.action.ActionRunnable; -import org.elasticsearch.action.support.GroupedActionListener; -import org.elasticsearch.action.support.PlainActionFuture; import org.elasticsearch.common.Nullable; import org.elasticsearch.common.blobstore.BlobContainer; import org.elasticsearch.common.blobstore.BlobMetaData; import org.elasticsearch.common.blobstore.BlobPath; import org.elasticsearch.common.blobstore.DeleteResult; import org.elasticsearch.common.blobstore.support.AbstractBlobContainer; -import org.elasticsearch.threadpool.ThreadPool; import java.io.IOException; import java.io.InputStream; @@ -42,20 +37,18 @@ import java.nio.file.NoSuchFileException; import java.util.List; import java.util.Map; -import java.util.concurrent.ExecutorService; +import java.util.stream.Collectors; public class AzureBlobContainer extends AbstractBlobContainer { private final Logger logger = LogManager.getLogger(AzureBlobContainer.class); private final AzureBlobStore blobStore; - private final ThreadPool threadPool; private final String keyPath; - AzureBlobContainer(BlobPath path, AzureBlobStore blobStore, ThreadPool threadPool) { + AzureBlobContainer(BlobPath path, AzureBlobStore blobStore) { super(path); this.blobStore = blobStore; this.keyPath = path.buildAsString(); - this.threadPool = threadPool; } private boolean blobExists(String blobName) { @@ -112,7 +105,7 @@ public void writeBlobAtomic(String blobName, InputStream inputStream, long blobS @Override public DeleteResult delete() throws IOException { try { - return blobStore.deleteBlobDirectory(keyPath, threadPool.executor(AzureRepositoryPlugin.REPOSITORY_THREAD_POOL_NAME)); + return blobStore.deleteBlobDirectory(keyPath); } catch (URISyntaxException | StorageException e) { throw new IOException(e); } @@ -120,33 +113,9 @@ public DeleteResult delete() throws IOException { @Override public void deleteBlobsIgnoringIfNotExists(List blobNames) throws IOException { - final PlainActionFuture result = PlainActionFuture.newFuture(); - if (blobNames.isEmpty()) { - result.onResponse(null); - } else { - final GroupedActionListener listener = - new GroupedActionListener<>(ActionListener.map(result, v -> null), blobNames.size()); - final ExecutorService executor = threadPool.executor(AzureRepositoryPlugin.REPOSITORY_THREAD_POOL_NAME); - // Executing deletes in parallel since Azure SDK 8 is using blocking IO while Azure does not provide a bulk 
delete API endpoint - // TODO: Upgrade to newer non-blocking Azure SDK 11 and execute delete requests in parallel that way. - for (String blobName : blobNames) { - executor.execute(ActionRunnable.run(listener, () -> { - logger.trace("deleteBlob({})", blobName); - try { - blobStore.deleteBlob(buildKey(blobName)); - } catch (StorageException e) { - if (e.getHttpStatusCode() != HttpURLConnection.HTTP_NOT_FOUND) { - throw new IOException(e); - } - } catch (URISyntaxException e) { - throw new IOException(e); - } - })); - } - } try { - result.actionGet(); - } catch (Exception e) { + blobStore.deleteBlobsIgnoringIfNotExists(blobNames.stream().map(this::buildKey).collect(Collectors.toList())); + } catch (URISyntaxException | StorageException e) { throw new IOException("Exception during bulk delete", e); } } diff --git a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobStore.java b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobStore.java index 714e29edea29d..173f13d801ff3 100644 --- a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobStore.java +++ b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureBlobStore.java @@ -28,14 +28,13 @@ import org.elasticsearch.common.blobstore.BlobStore; import org.elasticsearch.common.blobstore.DeleteResult; import org.elasticsearch.repositories.azure.AzureRepository.Repository; -import org.elasticsearch.threadpool.ThreadPool; import java.io.IOException; import java.io.InputStream; import java.net.URISyntaxException; +import java.util.Collection; import java.util.Collections; import java.util.Map; -import java.util.concurrent.Executor; import java.util.function.Function; import java.util.stream.Collectors; @@ -44,17 +43,15 @@ public class AzureBlobStore implements BlobStore { private final AzureStorageService service; - private final ThreadPool threadPool; private final String clientName; private final String container; private final LocationMode locationMode; - public AzureBlobStore(RepositoryMetaData metadata, AzureStorageService service, ThreadPool threadPool) { + public AzureBlobStore(RepositoryMetaData metadata, AzureStorageService service) { this.container = Repository.CONTAINER_SETTING.get(metadata.settings()); this.clientName = Repository.CLIENT_NAME.get(metadata.settings()); this.service = service; - this.threadPool = threadPool; // locationMode is set per repository, not per client this.locationMode = Repository.LOCATION_MODE_SETTING.get(metadata.settings()); final Map prevSettings = this.service.refreshAndClearCache(emptyMap()); @@ -80,7 +77,7 @@ public LocationMode getLocationMode() { @Override public BlobContainer blobContainer(BlobPath path) { - return new AzureBlobContainer(path, this, threadPool); + return new AzureBlobContainer(path, this); } @Override @@ -91,13 +88,12 @@ public boolean blobExists(String blob) throws URISyntaxException, StorageExcepti return service.blobExists(clientName, container, blob); } - public void deleteBlob(String blob) throws URISyntaxException, StorageException, IOException { - service.deleteBlob(clientName, container, blob); + public void deleteBlobsIgnoringIfNotExists(Collection blobs) throws URISyntaxException, StorageException { + service.deleteBlobsIgnoringIfNotExists(clientName, container, blobs); } - public DeleteResult deleteBlobDirectory(String path, Executor executor) - throws URISyntaxException, StorageException, IOException { - return 
service.deleteBlobDirectory(clientName, container, path, executor); + return service.deleteBlobDirectory(clientName, container, path); } public InputStream getInputStream(String blob) throws URISyntaxException, StorageException, IOException { @@ -111,7 +107,7 @@ public Map<String, BlobMetaData> listBlobsByPrefix(String keyPath, String prefix public Map<String, BlobContainer> children(BlobPath path) throws URISyntaxException, StorageException, IOException { return Collections.unmodifiableMap(service.children(clientName, container, path).stream().collect( - Collectors.toMap(Function.identity(), name -> new AzureBlobContainer(path.add(name), this, threadPool)))); + Collectors.toMap(Function.identity(), name -> new AzureBlobContainer(path.add(name), this)))); } public void writeBlob(String blobName, InputStream inputStream, long blobSize, boolean failIfAlreadyExists) diff --git a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepository.java b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepository.java index 7b7a9108c1ef5..a6a393bbb47c6 100644 --- a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepository.java +++ b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepository.java @@ -115,7 +115,7 @@ protected BlobStore getBlobStore() { @Override protected AzureBlobStore createBlobStore() { - final AzureBlobStore blobStore = new AzureBlobStore(metadata, storageService, threadPool); + final AzureBlobStore blobStore = new AzureBlobStore(metadata, storageService); logger.debug(() -> new ParameterizedMessage( "using container [{}], chunk_size [{}], compress [{}], base_path [{}]", diff --git a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepositoryPlugin.java b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepositoryPlugin.java index d98a7c3cbd717..ae1258a73b4f5 100644 --- a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepositoryPlugin.java +++ b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureRepositoryPlugin.java @@ -23,16 +23,12 @@ import org.elasticsearch.common.settings.Setting; import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.settings.SettingsException; -import org.elasticsearch.common.unit.TimeValue; import org.elasticsearch.common.xcontent.NamedXContentRegistry; import org.elasticsearch.env.Environment; import org.elasticsearch.plugins.Plugin; import org.elasticsearch.plugins.ReloadablePlugin; import org.elasticsearch.plugins.RepositoryPlugin; import org.elasticsearch.repositories.Repository; -import org.elasticsearch.threadpool.ExecutorBuilder; -import org.elasticsearch.threadpool.ScalingExecutorBuilder; - import java.util.Arrays; import java.util.Collections; import java.util.List; @@ -43,8 +39,6 @@ */ public class AzureRepositoryPlugin extends Plugin implements RepositoryPlugin, ReloadablePlugin { - public static final String REPOSITORY_THREAD_POOL_NAME = "repository_azure"; - // protected for testing final AzureStorageService azureStoreService; @@ -80,15 +74,6 @@ public List<Setting<?>> getSettings() { ); } - @Override - public List<ExecutorBuilder<?>> getExecutorBuilders(Settings settings) { - return Collections.singletonList(executorBuilder()); - } - - public static ExecutorBuilder<?> executorBuilder() { - return new 
ScalingExecutorBuilder(REPOSITORY_THREAD_POOL_NAME, 0, 32, TimeValue.timeValueSeconds(30L)); - } - @Override public void reload(Settings settings) { // secure settings should be readable diff --git a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureStorageService.java b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureStorageService.java index 26ade5bdec624..c15af49578386 100644 --- a/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureStorageService.java +++ b/plugins/repository-azure/src/main/java/org/elasticsearch/repositories/azure/AzureStorageService.java @@ -20,13 +20,16 @@ package org.elasticsearch.repositories.azure; import com.microsoft.azure.storage.AccessCondition; +import com.microsoft.azure.storage.BatchException; import com.microsoft.azure.storage.CloudStorageAccount; +import com.microsoft.azure.storage.Constants; import com.microsoft.azure.storage.OperationContext; import com.microsoft.azure.storage.RetryExponentialRetry; import com.microsoft.azure.storage.RetryPolicy; import com.microsoft.azure.storage.RetryPolicyFactory; import com.microsoft.azure.storage.StorageErrorCodeStrings; import com.microsoft.azure.storage.StorageException; +import com.microsoft.azure.storage.blob.BlobDeleteBatchOperation; import com.microsoft.azure.storage.blob.BlobInputStream; import com.microsoft.azure.storage.blob.BlobListingDetails; import com.microsoft.azure.storage.blob.BlobProperties; @@ -41,7 +44,6 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; import org.apache.logging.log4j.message.ParameterizedMessage; -import org.elasticsearch.action.support.PlainActionFuture; import org.elasticsearch.common.blobstore.BlobMetaData; import org.elasticsearch.common.blobstore.BlobPath; import org.elasticsearch.common.blobstore.DeleteResult; @@ -51,7 +53,6 @@ import org.elasticsearch.common.settings.SettingsException; import org.elasticsearch.common.unit.ByteSizeUnit; import org.elasticsearch.common.unit.ByteSizeValue; -import org.elasticsearch.common.util.concurrent.AbstractRunnable; import java.io.IOException; import java.io.InputStream; @@ -62,13 +63,13 @@ import java.security.InvalidKeyException; import java.util.ArrayList; import java.util.Collection; -import java.util.Collections; import java.util.EnumSet; import java.util.HashMap; import java.util.HashSet; +import java.util.Iterator; +import java.util.List; import java.util.Map; import java.util.Set; -import java.util.concurrent.Executor; import java.util.concurrent.atomic.AtomicLong; import java.util.function.Supplier; @@ -187,72 +188,61 @@ public boolean blobExists(String account, String container, String blob) throws }); } - public void deleteBlob(String account, String container, String blob) throws URISyntaxException, StorageException { + public void deleteBlobsIgnoringIfNotExists(String account, String container, Collection<String> blobs) + throws URISyntaxException, StorageException { + logger.trace(() -> new ParameterizedMessage("delete blobs for container [{}], blob [{}]", container, blobs)); final Tuple<CloudBlobClient, Supplier<OperationContext>> client = client(account); // Container name must be lower case. 
final CloudBlobContainer blobContainer = client.v1().getContainerReference(container); - logger.trace(() -> new ParameterizedMessage("delete blob for container [{}], blob [{}]", container, blob)); - SocketAccess.doPrivilegedVoidException(() -> { - final CloudBlockBlob azureBlob = blobContainer.getBlockBlobReference(blob); - logger.trace(() -> new ParameterizedMessage("container [{}]: blob [{}] found. removing.", container, blob)); - azureBlob.delete(DeleteSnapshotsOption.NONE, null, null, client.v2().get()); - }); + final Iterator<String> blobIterator = blobs.iterator(); + int currentBatchSize = 0; + while (blobIterator.hasNext()) { + final BlobDeleteBatchOperation batchDeleteOp = new BlobDeleteBatchOperation(); + do { + batchDeleteOp.addSubOperation(blobContainer.getBlockBlobReference(blobIterator.next()), + DeleteSnapshotsOption.NONE, null, null); + ++currentBatchSize; + } while (blobIterator.hasNext() && currentBatchSize < Constants.BATCH_MAX_REQUESTS); + currentBatchSize = 0; + try { + SocketAccess.doPrivilegedVoidException(() -> blobContainer.getServiceClient().executeBatch(batchDeleteOp)); + } catch (BatchException e) { + for (StorageException ex : e.getExceptions().values()) { + if (ex.getHttpStatusCode() != HttpURLConnection.HTTP_NOT_FOUND) { + logger.error("Batch exceptions [{}]", e.getExceptions()); + throw e; + } + } + } + } } - DeleteResult deleteBlobDirectory(String account, String container, String path, Executor executor) + DeleteResult deleteBlobDirectory(String account, String container, String path) throws URISyntaxException, StorageException, IOException { final Tuple<CloudBlobClient, Supplier<OperationContext>> client = client(account); final CloudBlobContainer blobContainer = client.v1().getContainerReference(container); - final Collection<Exception> exceptions = Collections.synchronizedList(new ArrayList<>()); - final AtomicLong outstanding = new AtomicLong(1L); - final PlainActionFuture<Void> result = PlainActionFuture.newFuture(); final AtomicLong blobsDeleted = new AtomicLong(); final AtomicLong bytesDeleted = new AtomicLong(); + final List<String> blobsToDelete = new ArrayList<>(); SocketAccess.doPrivilegedVoidException(() -> { - for (final ListBlobItem blobItem : blobContainer.listBlobs(path, true)) { + for (ListBlobItem blobItem : blobContainer.listBlobs(path, true)) { // uri.getPath is of the form /container/keyPath.* and we want to strip off the /container/ // this requires 1 + container.length() + 1, with each 1 corresponding to one of the / final String blobPath = blobItem.getUri().getPath().substring(1 + container.length() + 1); - outstanding.incrementAndGet(); - executor.execute(new AbstractRunnable() { - @Override - protected void doRun() throws Exception { - final long len; - if (blobItem instanceof CloudBlob) { - len = ((CloudBlob) blobItem).getProperties().getLength(); - } else { - len = -1L; - } - deleteBlob(account, container, blobPath); - blobsDeleted.incrementAndGet(); - if (len >= 0) { - bytesDeleted.addAndGet(len); - } - } - - @Override - public void onFailure(Exception e) { - exceptions.add(e); - } - - @Override - public void onAfter() { - if (outstanding.decrementAndGet() == 0) { - result.onResponse(null); - } - } - }); + final long len; + if (blobItem instanceof CloudBlob) { + len = ((CloudBlob) blobItem).getProperties().getLength(); + } else { + len = -1L; + } + blobsToDelete.add(blobPath); + blobsDeleted.incrementAndGet(); + if (len >= 0) { + bytesDeleted.addAndGet(len); + } } }); - if (outstanding.decrementAndGet() == 0) { - result.onResponse(null); - } - result.actionGet(); - if (exceptions.isEmpty() == false) { - 
final IOException ex = new IOException("Deleting directory [" + path + "] failed"); - exceptions.forEach(ex::addSuppressed); - throw ex; - } + deleteBlobsIgnoringIfNotExists(account, container, blobsToDelete); return new DeleteResult(blobsDeleted.get(), bytesDeleted.get()); } diff --git a/plugins/repository-azure/src/main/plugin-metadata/plugin-security.policy b/plugins/repository-azure/src/main/plugin-metadata/plugin-security.policy index 19a35f8405903..0e2572a63156d 100644 --- a/plugins/repository-azure/src/main/plugin-metadata/plugin-security.policy +++ b/plugins/repository-azure/src/main/plugin-metadata/plugin-security.policy @@ -20,4 +20,5 @@ grant { // azure client opens socket connections for to access repository permission java.net.SocketPermission "*", "connect"; + permission java.lang.RuntimePermission "setFactory"; }; diff --git a/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobContainerRetriesTests.java b/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobContainerRetriesTests.java index ce3cba065c35b..b8ec6115b4d17 100644 --- a/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobContainerRetriesTests.java +++ b/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobContainerRetriesTests.java @@ -44,8 +44,6 @@ import org.elasticsearch.rest.RestStatus; import org.elasticsearch.rest.RestUtils; import org.elasticsearch.test.ESTestCase; -import org.elasticsearch.threadpool.TestThreadPool; -import org.elasticsearch.threadpool.ThreadPool; import org.junit.After; import org.junit.Before; @@ -64,7 +62,6 @@ import java.util.Map; import java.util.Objects; import java.util.concurrent.ConcurrentHashMap; -import java.util.concurrent.TimeUnit; import java.util.concurrent.atomic.AtomicBoolean; import java.util.concurrent.atomic.AtomicInteger; import java.util.regex.Matcher; @@ -91,11 +88,9 @@ public class AzureBlobContainerRetriesTests extends ESTestCase { private HttpServer httpServer; - private ThreadPool threadPool; @Before public void setUp() throws Exception { - threadPool = new TestThreadPool(getTestClass().getName(), AzureRepositoryPlugin.executorBuilder()); httpServer = MockHttpServer.createHttp(new InetSocketAddress(InetAddress.getLoopbackAddress(), 0), 0); httpServer.start(); super.setUp(); @@ -105,7 +100,6 @@ public void setUp() throws Exception { public void tearDown() throws Exception { httpServer.stop(0); super.tearDown(); - ThreadPool.terminate(threadPool, 10L, TimeUnit.SECONDS); } private BlobContainer createBlobContainer(final int maxRetries) { @@ -145,7 +139,7 @@ BlobRequestOptions getBlobRequestOptionsForWriteBlob() { .put(ACCOUNT_SETTING.getKey(), clientName) .build()); - return new AzureBlobContainer(BlobPath.cleanPath(), new AzureBlobStore(repositoryMetaData, service, threadPool), threadPool); + return new AzureBlobContainer(BlobPath.cleanPath(), new AzureBlobStore(repositoryMetaData, service)); } public void testReadNonexistentBlobThrowsNoSuchFileException() { diff --git a/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobStoreRepositoryTests.java b/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobStoreRepositoryTests.java index b23693fd268d4..47703b90e43cc 100644 --- a/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobStoreRepositoryTests.java +++ 
b/plugins/repository-azure/src/test/java/org/elasticsearch/repositories/azure/AzureBlobStoreRepositoryTests.java @@ -64,7 +64,7 @@ protected Collection<Class<? extends Plugin>> nodePlugins() { @Override protected Map<String, HttpHandler> createHttpHandlers() { - return Collections.singletonMap("/container", new AzureBlobStoreHttpHandler("container")); + return Collections.singletonMap("/", new AzureBlobStoreHttpHandler("container")); } @Override diff --git a/plugins/repository-gcs/build.gradle b/plugins/repository-gcs/build.gradle index bd20c36922297..9af1ae73327ef 100644 --- a/plugins/repository-gcs/build.gradle +++ b/plugins/repository-gcs/build.gradle @@ -56,6 +56,12 @@ dependencies { testCompile project(':test:fixtures:gcs-fixture') } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes' + } +} + dependencyLicenses { mapping from: /google-cloud-.*/, to: 'google-cloud' mapping from: /google-auth-.*/, to: 'google-auth' diff --git a/plugins/repository-gcs/qa/google-cloud-storage/build.gradle b/plugins/repository-gcs/qa/google-cloud-storage/build.gradle index aaa5a4ddc553b..10105d2f2348c 100644 --- a/plugins/repository-gcs/qa/google-cloud-storage/build.gradle +++ b/plugins/repository-gcs/qa/google-cloud-storage/build.gradle @@ -36,6 +36,12 @@ dependencies { testCompile project(path: ':plugins:repository-gcs') } +restResources { + restApi { + includeCore '_common', 'snapshot','indices', 'index', 'bulk', 'count' + } +} + testFixtures.useFixture(':test:fixtures:gcs-fixture') boolean useFixture = false diff --git a/plugins/repository-hdfs/build.gradle b/plugins/repository-hdfs/build.gradle index 83e3735300a0f..bf570b8e0abb6 100644 --- a/plugins/repository-hdfs/build.gradle +++ b/plugins/repository-hdfs/build.gradle @@ -74,6 +74,12 @@ dependencies { } } +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'indices', 'index', 'snapshot' + } +} + normalization { runtimeClasspath { // ignore generated keytab files for the purposes of build avoidance diff --git a/plugins/repository-s3/build.gradle b/plugins/repository-s3/build.gradle index a92f07b71ee01..a7c8d49a50728 100644 --- a/plugins/repository-s3/build.gradle +++ b/plugins/repository-s3/build.gradle @@ -30,7 +30,7 @@ esplugin { } versions << [ - 'aws': '1.11.636' + 'aws': '1.11.749' ] dependencies { @@ -43,7 +43,7 @@ dependencies { compile "org.apache.logging.log4j:log4j-1.2-api:${versions.log4j}" compile "commons-codec:commons-codec:${versions.commonscodec}" compile "com.fasterxml.jackson.core:jackson-core:${versions.jackson}" - compile "com.fasterxml.jackson.core:jackson-databind:${versions.jacksondatabind}" + compile "com.fasterxml.jackson.core:jackson-databind:${versions.jackson}" compile "com.fasterxml.jackson.core:jackson-annotations:${versions.jackson}" compile "com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:${versions.jackson}" compile "joda-time:joda-time:${versions.joda}" diff --git a/plugins/repository-s3/licenses/aws-java-sdk-core-1.11.636.jar.sha1 b/plugins/repository-s3/licenses/aws-java-sdk-core-1.11.636.jar.sha1 deleted file mode 100644 index b9ee9c102dbcb..0000000000000 --- a/plugins/repository-s3/licenses/aws-java-sdk-core-1.11.636.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -84c9f180f8f60f6f1433c9c5253fcb704593b121 \ No newline at end of file diff --git a/plugins/repository-s3/licenses/aws-java-sdk-core-1.11.749.jar.sha1 b/plugins/repository-s3/licenses/aws-java-sdk-core-1.11.749.jar.sha1 new file mode 100644 index 0000000000000..7bc18d6d4f681 --- /dev/null +++ 
b/plugins/repository-s3/licenses/aws-java-sdk-core-1.11.749.jar.sha1 @@ -0,0 +1 @@ +1da5c1549295cfeebc67fc1c7539785a9441755b \ No newline at end of file diff --git a/plugins/repository-s3/licenses/aws-java-sdk-s3-1.11.636.jar.sha1 b/plugins/repository-s3/licenses/aws-java-sdk-s3-1.11.636.jar.sha1 deleted file mode 100644 index 1e05e98d240d2..0000000000000 --- a/plugins/repository-s3/licenses/aws-java-sdk-s3-1.11.636.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -f86fc1993ac8122f6f02a8eb9b467b5f945cd76b \ No newline at end of file diff --git a/plugins/repository-s3/licenses/aws-java-sdk-s3-1.11.749.jar.sha1 b/plugins/repository-s3/licenses/aws-java-sdk-s3-1.11.749.jar.sha1 new file mode 100644 index 0000000000000..af794dc59dd7f --- /dev/null +++ b/plugins/repository-s3/licenses/aws-java-sdk-s3-1.11.749.jar.sha1 @@ -0,0 +1 @@ +7d069f82723907ccdbd0c91ef0ac76046f5c9652 \ No newline at end of file diff --git a/plugins/repository-s3/licenses/jackson-annotations-2.10.3.jar.sha1 b/plugins/repository-s3/licenses/jackson-annotations-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..9c725f2d90e69 --- /dev/null +++ b/plugins/repository-s3/licenses/jackson-annotations-2.10.3.jar.sha1 @@ -0,0 +1 @@ +0f63b3b1da563767d04d2e4d3fc1ae0cdeffebe7 \ No newline at end of file diff --git a/plugins/repository-s3/licenses/jackson-annotations-2.8.11.jar.sha1 b/plugins/repository-s3/licenses/jackson-annotations-2.8.11.jar.sha1 deleted file mode 100644 index 30e7d1a7b1a74..0000000000000 --- a/plugins/repository-s3/licenses/jackson-annotations-2.8.11.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -391de20b4e29cb3fb07d2454ace64be2c82ac91f \ No newline at end of file diff --git a/plugins/repository-s3/licenses/jackson-databind-2.10.3.jar.sha1 b/plugins/repository-s3/licenses/jackson-databind-2.10.3.jar.sha1 new file mode 100644 index 0000000000000..688ae92d10792 --- /dev/null +++ b/plugins/repository-s3/licenses/jackson-databind-2.10.3.jar.sha1 @@ -0,0 +1 @@ +aae92628b5447fa25af79871ca98668da6edd439 \ No newline at end of file diff --git a/plugins/repository-s3/licenses/jackson-databind-2.8.11.6.jar.sha1 b/plugins/repository-s3/licenses/jackson-databind-2.8.11.6.jar.sha1 deleted file mode 100644 index f491259db56bc..0000000000000 --- a/plugins/repository-s3/licenses/jackson-databind-2.8.11.6.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -35753201d0cdb1dbe998ab289bca1180b68d4368 \ No newline at end of file diff --git a/plugins/repository-s3/licenses/jmespath-java-1.11.636.jar.sha1 b/plugins/repository-s3/licenses/jmespath-java-1.11.636.jar.sha1 deleted file mode 100644 index 70c0d3633af07..0000000000000 --- a/plugins/repository-s3/licenses/jmespath-java-1.11.636.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -e468c349ce410171a1d5df7fa0fa377d52c5d651 \ No newline at end of file diff --git a/plugins/repository-s3/licenses/jmespath-java-1.11.749.jar.sha1 b/plugins/repository-s3/licenses/jmespath-java-1.11.749.jar.sha1 new file mode 100644 index 0000000000000..3467802d074c7 --- /dev/null +++ b/plugins/repository-s3/licenses/jmespath-java-1.11.749.jar.sha1 @@ -0,0 +1 @@ +778866bc557dba508ee0eab2a0c5bfde468e49e6 \ No newline at end of file diff --git a/plugins/repository-s3/src/main/java/org/elasticsearch/repositories/s3/S3Service.java b/plugins/repository-s3/src/main/java/org/elasticsearch/repositories/s3/S3Service.java index 77117550dc0d9..73149bf92073a 100644 --- a/plugins/repository-s3/src/main/java/org/elasticsearch/repositories/s3/S3Service.java +++ b/plugins/repository-s3/src/main/java/org/elasticsearch/repositories/s3/S3Service.java @@ -141,7 
+141,12 @@ AmazonS3 buildClient(final S3ClientSettings clientSettings) { builder.withCredentials(buildCredentials(logger, clientSettings)); builder.withClientConfiguration(buildConfiguration(clientSettings)); - final String endpoint = Strings.hasLength(clientSettings.endpoint) ? clientSettings.endpoint : Constants.S3_HOSTNAME; + String endpoint = Strings.hasLength(clientSettings.endpoint) ? clientSettings.endpoint : Constants.S3_HOSTNAME; + if ((endpoint.startsWith("http://") || endpoint.startsWith("https://")) == false) { + // Manually add the schema to the endpoint to work around https://github.com/aws/aws-sdk-java/issues/2274 + // TODO: Remove this once fixed in the AWS SDK + endpoint = clientSettings.protocol.toString() + "://" + endpoint; + } final String region = Strings.hasLength(clientSettings.region) ? clientSettings.region : null; logger.debug("using endpoint [{}] and region [{}]", endpoint, region); @@ -160,7 +165,7 @@ AmazonS3 buildClient(final S3ClientSettings clientSettings) { if (clientSettings.disableChunkedEncoding) { builder.disableChunkedEncoding(); } - return builder.build(); + return SocketAccess.doPrivileged(builder::build); } // pkg private for tests diff --git a/plugins/repository-s3/src/test/java/org/elasticsearch/repositories/s3/S3BlobContainerRetriesTests.java b/plugins/repository-s3/src/test/java/org/elasticsearch/repositories/s3/S3BlobContainerRetriesTests.java index 7060082ffcdfa..a8affd31aa857 100644 --- a/plugins/repository-s3/src/test/java/org/elasticsearch/repositories/s3/S3BlobContainerRetriesTests.java +++ b/plugins/repository-s3/src/test/java/org/elasticsearch/repositories/s3/S3BlobContainerRetriesTests.java @@ -170,7 +170,7 @@ public void testReadBlobWithRetries() throws Exception { } }); - final TimeValue readTimeout = TimeValue.timeValueMillis(between(100, 500)); + final TimeValue readTimeout = TimeValue.timeValueSeconds(between(1, 3)); final BlobContainer blobContainer = createBlobContainer(maxRetries, readTimeout, null, null); try (InputStream inputStream = blobContainer.readBlob("read_blob_max_retries")) { assertArrayEquals(bytes, BytesReference.toBytes(Streams.readFully(inputStream))); @@ -397,13 +397,15 @@ private static byte[] randomBlobContent() { return randomByteArrayOfLength(randomIntBetween(1, frequently() ? 512 : 1 << 20)); // rarely up to 1mb } + private static final Pattern RANGE_PATTERN = Pattern.compile("^bytes=([0-9]+)-9223372036854775806$"); + private static int getRangeStart(HttpExchange exchange) { final String rangeHeader = exchange.getRequestHeaders().getFirst("Range"); if (rangeHeader == null) { return 0; } - final Matcher matcher = Pattern.compile("^bytes=([0-9]+)-9223372036854775806$").matcher(rangeHeader); + final Matcher matcher = RANGE_PATTERN.matcher(rangeHeader); assertTrue(rangeHeader + " matches expected pattern", matcher.matches()); return Math.toIntExact(Long.parseLong(matcher.group(1))); } diff --git a/plugins/store-smb/build.gradle b/plugins/store-smb/build.gradle index f2238e0a49c90..87aad3e59769b 100644 --- a/plugins/store-smb/build.gradle +++ b/plugins/store-smb/build.gradle @@ -21,4 +21,8 @@ esplugin { description 'The Store SMB plugin adds support for SMB stores.' 
classname 'org.elasticsearch.plugin.store.smb.SMBStorePlugin' } - +restResources { + restApi { + includeCore '_common', 'cluster', 'nodes', 'index', 'indices', 'get' + } +} diff --git a/qa/evil-tests/src/test/java/org/elasticsearch/common/logging/EvilLoggerTests.java b/qa/evil-tests/src/test/java/org/elasticsearch/common/logging/EvilLoggerTests.java index e32447c47b092..2e35ac9581f9a 100644 --- a/qa/evil-tests/src/test/java/org/elasticsearch/common/logging/EvilLoggerTests.java +++ b/qa/evil-tests/src/test/java/org/elasticsearch/common/logging/EvilLoggerTests.java @@ -164,7 +164,8 @@ public void testConcurrentDeprecationLogger() throws IOException, UserException, */ final List<String> warnings = threadContext.getResponseHeaders().get("Warning"); final Set<String> actualWarningValues = - warnings.stream().map(DeprecationLogger::extractWarningValueFromWarningHeader).collect(Collectors.toSet()); + warnings.stream().map(s -> DeprecationLogger.extractWarningValueFromWarningHeader(s, true)) + .collect(Collectors.toSet()); for (int j = 0; j < 128; j++) { assertThat( actualWarningValues, diff --git a/qa/multi-cluster-search/src/test/java/org/elasticsearch/search/CCSDuelIT.java b/qa/multi-cluster-search/src/test/java/org/elasticsearch/search/CCSDuelIT.java index 6bf3bc643b950..0c16e539fcaaf 100644 --- a/qa/multi-cluster-search/src/test/java/org/elasticsearch/search/CCSDuelIT.java +++ b/qa/multi-cluster-search/src/test/java/org/elasticsearch/search/CCSDuelIT.java @@ -124,7 +124,7 @@ public class CCSDuelIT extends ESRestTestCase { private static final String INDEX_NAME = "ccs_duel_index"; private static final String REMOTE_INDEX_NAME = "my_remote_cluster:" + INDEX_NAME; - private static final String[] TAGS = new String[]{"java", "xml", "sql", "html", "php", "ruby", "python", "perl"}; + private static final String[] TAGS = new String[] {"java", "xml", "sql", "html", "php", "ruby", "python", "perl"}; private static RestHighLevelClient restHighLevelClient; @@ -435,6 +435,8 @@ public void testSortByField() throws Exception { public void testSortByFieldOneClusterHasNoResults() throws Exception { assumeMultiClusterSetup(); SearchRequest searchRequest = initSearchRequest(); + // set to a value greater than the number of shards to avoid differences due to the skipping of shards + searchRequest.setPreFilterShardSize(128); SearchSourceBuilder sourceBuilder = new SearchSourceBuilder(); boolean onlyRemote = randomBoolean(); sourceBuilder.query(new TermQueryBuilder("_index", onlyRemote ?
REMOTE_INDEX_NAME : INDEX_NAME)); diff --git a/qa/smoke-test-http/src/test/java/org/elasticsearch/http/DeprecationHttpIT.java b/qa/smoke-test-http/src/test/java/org/elasticsearch/http/DeprecationHttpIT.java index fc686c5671997..c9bed57fb13e7 100644 --- a/qa/smoke-test-http/src/test/java/org/elasticsearch/http/DeprecationHttpIT.java +++ b/qa/smoke-test-http/src/test/java/org/elasticsearch/http/DeprecationHttpIT.java @@ -187,7 +187,8 @@ private void doTestDeprecationWarningsAppearInHeaders() throws IOException { assertThat(deprecatedWarning, matches(WARNING_HEADER_PATTERN.pattern())); } final List<String> actualWarningValues = - deprecatedWarnings.stream().map(DeprecationLogger::extractWarningValueFromWarningHeader).collect(Collectors.toList()); + deprecatedWarnings.stream().map(s -> DeprecationLogger.extractWarningValueFromWarningHeader(s, true)) + .collect(Collectors.toList()); for (Matcher<String> headerMatcher : headerMatchers) { assertThat(actualWarningValues, hasItem(headerMatcher)); } } diff --git a/rest-api-spec/README.markdown b/rest-api-spec/README.markdown index 15b3c14941909..e2c3d37f11da3 100644 --- a/rest-api-spec/README.markdown +++ b/rest-api-spec/README.markdown @@ -43,7 +43,12 @@ Example for the ["Create Index"](https://www.elastic.co/guide/en/elasticsearch/r The specification contains: * The _name_ of the API (`indices.create`), which usually corresponds to the client calls -* Link to the documentation at the website +* Link to the documentation at the website. + + **IMPORTANT:** This should be a _live_ link. Several downstream ES clients use + this link to generate their documentation. Using a broken link or linking to + yet-to-be-created doc pages can break the [Elastic docs + build](https://github.com/elastic/docs#building-documentation). * `stability` indicating the state of the API, has to be declared explicitly or YAML tests will fail * `experimental` highly likely to break in the near future (minor/patch), no bwc guarantees. Possibly removed in the future.
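The rest-api-spec files added below specify three new component template endpoints (put, get, delete). As a minimal usage sketch of how a client could exercise them once the es.itv2_feature_flag_registered flag is enabled (the host, port, template name, and request body here are illustrative assumptions, not part of this change), using the low-level Java REST client:

import org.apache.http.HttpHost;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public class ComponentTemplateSketch {
    public static void main(String[] args) throws Exception {
        try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200, "http")).build()) {
            // PUT /_component_template/{name}, per cluster.put_component_template.json
            Request put = new Request("PUT", "/_component_template/test");
            put.setJsonEntity("{\"template\":{\"settings\":{\"number_of_shards\":1}}}");
            client.performRequest(put);

            // GET /_component_template/{name}, per cluster.get_component_template.json
            Response get = client.performRequest(new Request("GET", "/_component_template/test"));
            System.out.println(get.getStatusLine());

            // DELETE /_component_template/{name}, per cluster.delete_component_template.json
            client.performRequest(new Request("DELETE", "/_component_template/test"));
        }
    }
}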
diff --git a/rest-api-spec/build.gradle b/rest-api-spec/build.gradle index fa29345e0ff6b..8902c57211ef2 100644 --- a/rest-api-spec/build.gradle +++ b/rest-api-spec/build.gradle @@ -1,15 +1,11 @@ apply plugin: 'elasticsearch.build' apply plugin: 'nebula.maven-base-publish' apply plugin: 'nebula.maven-scm' +apply plugin: 'elasticsearch.rest-resources' test.enabled = false jarHell.enabled = false -configurations { - restSpecs - restTests -} - artifacts { restSpecs(new File(projectDir, "src/main/resources/rest-api-spec/api")) restTests(new File(projectDir, "src/main/resources/rest-api-spec/test")) diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.delete_component_template.json b/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.delete_component_template.json new file mode 100644 index 0000000000000..6ddfe6c7ead5f --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.delete_component_template.json @@ -0,0 +1,35 @@ +{ + "cluster.delete_component_template":{ + "documentation":{ + "url":"https://www.elastic.co/guide/en/elasticsearch/reference/master/indices-component-templates.html", + "description":"Deletes a component template" + }, + "stability":"stable", + "url":{ + "paths":[ + { + "path":"/_component_template/{name}", + "methods":[ + "DELETE" + ], + "parts":{ + "name":{ + "type":"string", + "description":"The name of the template" + } + } + } + ] + }, + "params":{ + "timeout":{ + "type":"time", + "description":"Explicit operation timeout" + }, + "master_timeout":{ + "type":"time", + "description":"Specify timeout for connection to master" + } + } + } +} diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.get_component_template.json b/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.get_component_template.json new file mode 100644 index 0000000000000..27bd093e620f0 --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.get_component_template.json @@ -0,0 +1,41 @@ +{ + "cluster.get_component_template":{ + "documentation":{ + "url":"https://www.elastic.co/guide/en/elasticsearch/reference/master/indices-component-templates.html", + "description":"Returns one or more component templates" + }, + "stability":"stable", + "url":{ + "paths":[ + { + "path":"/_component_template", + "methods":[ + "GET" + ] + }, + { + "path":"/_component_template/{name}", + "methods":[ + "GET" + ], + "parts":{ + "name":{ + "type":"list", + "description":"The comma separated names of the component templates" + } + } + } + ] + }, + "params":{ + "master_timeout":{ + "type":"time", + "description":"Explicit operation timeout for connection to master node" + }, + "local":{ + "type":"boolean", + "description":"Return local information, do not retrieve the state from master node (default: false)" + } + } + } +} diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.put_component_template.json b/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.put_component_template.json new file mode 100644 index 0000000000000..3bdac2357d023 --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/cluster.put_component_template.json @@ -0,0 +1,45 @@ +{ + "cluster.put_component_template":{ + "documentation":{ + "url":"https://www.elastic.co/guide/en/elasticsearch/reference/master/indices-component-templates.html", + "description":"Creates or updates a component template" + }, + "stability":"stable", + "url":{ + "paths":[ + { + "path":"/_component_template/{name}", + "methods":[ + "PUT", + "POST" + 
], + "parts":{ + "name":{ + "type":"string", + "description":"The name of the template" + } + } + } + ] + }, + "params":{ + "create":{ + "type":"boolean", + "description":"Whether the index template should only be added if new or can also replace an existing one", + "default":false + }, + "timeout":{ + "type":"time", + "description":"Explicit operation timeout" + }, + "master_timeout":{ + "type":"time", + "description":"Specify timeout for connection to master" + } + }, + "body":{ + "description":"The template definition", + "required":true + } + } +} diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/indices.create_data_stream.json b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.create_data_stream.json new file mode 100644 index 0000000000000..ef8615a69b1ca --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.create_data_stream.json @@ -0,0 +1,31 @@ +{ + "indices.create_data_stream":{ + "documentation":{ + "url":"https://www.elastic.co/guide/en/elasticsearch/reference/master/data-streams.html", + "description":"Creates or updates a data stream" + }, + "stability":"experimental", + "url":{ + "paths":[ + { + "path":"/_data_stream/{name}", + "methods":[ + "PUT" + ], + "parts":{ + "name":{ + "type":"string", + "description":"The name of the data stream" + } + } + } + ] + }, + "params":{ + }, + "body":{ + "description":"The data stream definition", + "required":true + } + } +} diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/indices.delete_data_stream.json b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.delete_data_stream.json new file mode 100644 index 0000000000000..71ed5808caefc --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.delete_data_stream.json @@ -0,0 +1,26 @@ +{ + "indices.delete_data_stream":{ + "documentation":{ + "url":"https://www.elastic.co/guide/en/elasticsearch/reference/master/data-streams.html", + "description":"Deletes a data stream." + }, + "stability":"experimental", + "url":{ + "paths":[ + { + "path":"/_data_stream/{name}", + "methods":[ + "DELETE" + ], + "parts":{ + "name":{ + "type":"string", + "description":"The name of the data stream" + } + } + } + ] + }, + "params":{} + } +} diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/indices.get_data_streams.json b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.get_data_streams.json new file mode 100644 index 0000000000000..42415068d4a5d --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.get_data_streams.json @@ -0,0 +1,33 @@ +{ + "indices.get_data_streams":{ + "documentation":{ + "url":"https://www.elastic.co/guide/en/elasticsearch/reference/master/data-streams.html", + "description":"Returns data streams." 
+ }, + "stability":"experimental", + "url":{ + "paths":[ + { + "path":"/_data_streams", + "methods":[ + "GET" + ] + }, + { + "path":"/_data_streams/{name}", + "methods":[ + "GET" + ], + "parts":{ + "name":{ + "type":"list", + "description":"The comma separated names of data streams" + } + } + } + ] + }, + "params":{ + } + } +} diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/indices.put_template.json b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.put_template.json index 133776199b3ac..75a328af929ef 100644 --- a/rest-api-spec/src/main/resources/rest-api-spec/api/indices.put_template.json +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/indices.put_template.json @@ -32,17 +32,9 @@ "description":"Whether the index template should only be added if new or can also replace an existing one", "default":false }, - "timeout":{ - "type":"time", - "description":"Explicit operation timeout" - }, "master_timeout":{ "type":"time", "description":"Specify timeout for connection to master" - }, - "flat_settings":{ - "type":"boolean", - "description":"Return settings in flat format (default: false)" } }, "body":{ diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/msearch.json b/rest-api-spec/src/main/resources/rest-api-spec/api/msearch.json index 8dd9da844520a..968ccfd8e718f 100644 --- a/rest-api-spec/src/main/resources/rest-api-spec/api/msearch.json +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/msearch.json @@ -50,8 +50,7 @@ }, "pre_filter_shard_size":{ "type":"number", - "description":"A threshold that enforces a pre-filter roundtrip to prefilter search shards based on query rewriting if the number of shards the search request expands to exceeds the threshold. This filter roundtrip can limit the number of shards significantly if for instance a shard can not match any documents based on it's rewrite method ie. if date filters are mandatory to match but the shard bounds and the query are disjoint.", - "default":128 + "description":"A threshold that enforces a pre-filter roundtrip to prefilter search shards based on query rewriting if the number of shards the search request expands to exceeds the threshold. This filter roundtrip can limit the number of shards significantly if for instance a shard can not match any documents based on its rewrite method ie. if date filters are mandatory to match but the shard bounds and the query are disjoint." }, "max_concurrent_shard_requests":{ "type":"number", diff --git a/rest-api-spec/src/main/resources/rest-api-spec/api/search.json b/rest-api-spec/src/main/resources/rest-api-spec/api/search.json index 1e0c232efa055..ac321acf8907b 100644 --- a/rest-api-spec/src/main/resources/rest-api-spec/api/search.json +++ b/rest-api-spec/src/main/resources/rest-api-spec/api/search.json @@ -219,8 +219,7 @@ }, "pre_filter_shard_size":{ "type":"number", - "description":"A threshold that enforces a pre-filter roundtrip to prefilter search shards based on query rewriting if the number of shards the search request expands to exceeds the threshold. This filter roundtrip can limit the number of shards significantly if for instance a shard can not match any documents based on it's rewrite method ie. if date filters are mandatory to match but the shard bounds and the query are disjoint.", - "default":128 + "description":"A threshold that enforces a pre-filter roundtrip to prefilter search shards based on query rewriting if the number of shards the search request expands to exceeds the threshold. 
This filter roundtrip can limit the number of shards significantly if for instance a shard can not match any documents based on its rewrite method ie. if date filters are mandatory to match but the shard bounds and the query are disjoint." }, "rest_total_hits_as_int":{ "type":"boolean", diff --git a/rest-api-spec/src/main/resources/rest-api-spec/test/cat.aliases/40_hidden.yml b/rest-api-spec/src/main/resources/rest-api-spec/test/cat.aliases/40_hidden.yml index 6866ff595fefd..3aa7fdbb1f760 100644 --- a/rest-api-spec/src/main/resources/rest-api-spec/test/cat.aliases/40_hidden.yml +++ b/rest-api-spec/src/main/resources/rest-api-spec/test/cat.aliases/40_hidden.yml @@ -1,9 +1,8 @@ --- "Test cat aliases output with a hidden index with a hidden alias": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices and aliases were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - do: indices.create: @@ -59,9 +58,8 @@ --- "Test cat aliases output with a hidden index with a visible alias": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices and aliases were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - do: indices.create: @@ -106,9 +104,8 @@ --- "Test cat aliases output with a visible index with a hidden alias": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices and aliases were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - do: indices.create: diff --git a/rest-api-spec/src/main/resources/rest-api-spec/test/cat.indices/20_hidden.yml b/rest-api-spec/src/main/resources/rest-api-spec/test/cat.indices/20_hidden.yml index 3a4fe28c85996..f357f362fd8eb 100644 --- a/rest-api-spec/src/main/resources/rest-api-spec/test/cat.indices/20_hidden.yml +++ b/rest-api-spec/src/main/resources/rest-api-spec/test/cat.indices/20_hidden.yml @@ -1,9 +1,8 @@ --- "Test cat indices output for hidden index": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - do: indices.create: index: index1 @@ -40,9 +39,8 @@ --- "Test cat indices output for dot-hidden index and dot-prefixed pattern": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - do: indices.create: index: .index1 @@ -79,9 +77,8 @@ --- "Test cat indices output with a hidden index with a visible alias": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - do: indices.create: @@ -142,9 +139,8 @@ --- "Test cat indices output with a hidden index with a hidden alias": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices and aliases were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - do: indices.create: @@ -203,9 +199,8 @@ --- "Test cat indices output with a hidden index, dot-hidden alias and dot pattern": - skip: - version: "- 7.99.99" + version: "- 7.6.99" reason: "hidden indices and aliases were added in 7.7.0" - # TODO: Update this in/after backport of https://github.com/elastic/elasticsearch/pull/53248 - 
do: indices.create: diff --git a/rest-api-spec/src/main/resources/rest-api-spec/test/cat.shards/10_basic.yml b/rest-api-spec/src/main/resources/rest-api-spec/test/cat.shards/10_basic.yml index de5e632975752..aa4abc7a11eae 100644 --- a/rest-api-spec/src/main/resources/rest-api-spec/test/cat.shards/10_basic.yml +++ b/rest-api-spec/src/main/resources/rest-api-spec/test/cat.shards/10_basic.yml @@ -1,8 +1,8 @@ --- "Help": - skip: - version: " - 7.1.99" - reason: external refresh stats were added in 7.2.0 + version: " - 7.99.99" + reason: shard path stats were added in 8.0.0 - do: cat.shards: help: true @@ -78,6 +78,8 @@ warmer.current .+ \n warmer.total .+ \n warmer.total_time .+ \n + path.data .+ \n + path.state .+ \n $/ --- "Test cat shards output": diff --git a/rest-api-spec/src/main/resources/rest-api-spec/test/cluster.component_template/10_basic.yml b/rest-api-spec/src/main/resources/rest-api-spec/test/cluster.component_template/10_basic.yml new file mode 100644 index 0000000000000..51b1fa1bd6d7d --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/test/cluster.component_template/10_basic.yml @@ -0,0 +1,44 @@ +--- +"Basic CRUD": + + - do: + cluster.put_component_template: + name: test + body: + template: + settings: + number_of_shards: 1 + number_of_replicas: 0 + mappings: + properties: + field: + type: keyword + aliases: + aliasname: {} + version: 2 + _meta: + foo: bar + baz: + eggplant: true + + - do: + cluster.get_component_template: + name: test + + - match: {component_templates.0.name: test} + - match: {component_templates.0.component_template.version: 2} + - match: {component_templates.0.component_template._meta: {foo: bar, baz: {eggplant: true}}} + - match: {component_templates.0.component_template.template.settings: {index: {number_of_shards: '1', number_of_replicas: '0'}}} + - match: {component_templates.0.component_template.template.mappings: {properties: {field: {type: keyword}}}} + - match: {component_templates.0.component_template.template.aliases: {aliasname: {}}} + + - do: + cluster.delete_component_template: + name: test + + - do: + catch: missing + cluster.get_component_template: + name: test + + - is_false: test diff --git a/rest-api-spec/src/main/resources/rest-api-spec/test/indices.data_stream/10_basic.yml b/rest-api-spec/src/main/resources/rest-api-spec/test/indices.data_stream/10_basic.yml new file mode 100644 index 0000000000000..d21abfc11c754 --- /dev/null +++ b/rest-api-spec/src/main/resources/rest-api-spec/test/indices.data_stream/10_basic.yml @@ -0,0 +1,33 @@ +--- +"Create data stream": + - skip: + version: "all" + reason: "AwaitsFix https://github.com/elastic/elasticsearch/issues/54022" + + - do: + indices.create_data_stream: + name: simple-data-stream1 + body: + timestamp_field: "@timestamp" + - is_true: acknowledged + + - do: + indices.create_data_stream: + name: simple-data-stream2 + body: + timestamp_field: "@timestamp2" + - is_true: acknowledged + + - do: + indices.get_data_streams: {} + - match: { 0.name: simple-data-stream1 } + - match: { 0.timestamp_field: '@timestamp' } + - match: { 0.indices: [] } + - match: { 1.name: simple-data-stream2 } + - match: { 1.timestamp_field: '@timestamp2' } + - match: { 1.indices: [] } + + - do: + indices.delete_data_stream: + name: simple-data-stream2 + - is_true: acknowledged diff --git a/server/build.gradle b/server/build.gradle index fe0b2c3fee933..59c41c7679d20 100644 --- a/server/build.gradle +++ b/server/build.gradle @@ -35,7 +35,7 @@ publishing { archivesBaseName = 'elasticsearch' // we want to keep 
the JDKs in our IDEs set to JDK 11 until minimum JDK is bumped to 17 so we do not include this source set in our IDEs -if (!isEclipse && !isIdea) { +if (!isEclipse) { sourceSets { java12 { java { @@ -82,7 +82,7 @@ dependencies { compile project(":libs:elasticsearch-geo") compileOnly project(':libs:elasticsearch-plugin-classloader') - testRuntime project(':libs:elasticsearch-plugin-classloader') + testRuntimeOnly project(':libs:elasticsearch-plugin-classloader') // lucene compile "org.apache.lucene:lucene-core:${versions.lucene}" @@ -123,7 +123,7 @@ dependencies { // repackaged jna with native bits linked against all elastic supported platforms compile "org.elasticsearch:jna:${versions.jna}" - if (!isEclipse && !isIdea) { + if (!isEclipse) { java12Compile sourceSets.main.output } diff --git a/server/licenses/lucene-analyzers-common-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-analyzers-common-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 242bbfc9bd604..0000000000000 --- a/server/licenses/lucene-analyzers-common-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -0365c37a03123ee8e30f75e44a1cb7d5ddd2fc52 \ No newline at end of file diff --git a/server/licenses/lucene-analyzers-common-8.5.0.jar.sha1 b/server/licenses/lucene-analyzers-common-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..398344daaa992 --- /dev/null +++ b/server/licenses/lucene-analyzers-common-8.5.0.jar.sha1 @@ -0,0 +1 @@ +7156f2e545fd6e7faaee4781d15eb60cf5f07646 \ No newline at end of file diff --git a/server/licenses/lucene-backward-codecs-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-backward-codecs-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 056fe4a00cbb5..0000000000000 --- a/server/licenses/lucene-backward-codecs-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -d56b30f75b2df92da8c6c0965ce72e7abb86347b \ No newline at end of file diff --git a/server/licenses/lucene-backward-codecs-8.5.0.jar.sha1 b/server/licenses/lucene-backward-codecs-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..1f926add1ffa2 --- /dev/null +++ b/server/licenses/lucene-backward-codecs-8.5.0.jar.sha1 @@ -0,0 +1 @@ +5837d9ec231b998d9eb75a99f3bf1dc9748c8f46 \ No newline at end of file diff --git a/server/licenses/lucene-core-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-core-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index fe30b9975cab4..0000000000000 --- a/server/licenses/lucene-core-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -39933692162e28c2719b60f499204b28236a2858 \ No newline at end of file diff --git a/server/licenses/lucene-core-8.5.0.jar.sha1 b/server/licenses/lucene-core-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..588b450154d09 --- /dev/null +++ b/server/licenses/lucene-core-8.5.0.jar.sha1 @@ -0,0 +1 @@ +3f9ea85fff4fc3f7c83869dddb9b0ef7818c0cae \ No newline at end of file diff --git a/server/licenses/lucene-grouping-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-grouping-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index c3ef488e826f8..0000000000000 --- a/server/licenses/lucene-grouping-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -2e56cc12d2f77d82946299b66f3416f9e621b2f3 \ No newline at end of file diff --git a/server/licenses/lucene-grouping-8.5.0.jar.sha1 b/server/licenses/lucene-grouping-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..89c60926a4a82 --- /dev/null +++ 
b/server/licenses/lucene-grouping-8.5.0.jar.sha1 @@ -0,0 +1 @@ +08d26d94f32b38d15eaf68b17bef52158e4bbc87 \ No newline at end of file diff --git a/server/licenses/lucene-highlighter-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-highlighter-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index cfa7fa961780b..0000000000000 --- a/server/licenses/lucene-highlighter-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -5cddb5b65e7ead641483dcc2ffb0e50ad8d26eb7 \ No newline at end of file diff --git a/server/licenses/lucene-highlighter-8.5.0.jar.sha1 b/server/licenses/lucene-highlighter-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..9ec3d4b53fef0 --- /dev/null +++ b/server/licenses/lucene-highlighter-8.5.0.jar.sha1 @@ -0,0 +1 @@ +8c653f47ea042dec2920bab83b039774b567eb9f \ No newline at end of file diff --git a/server/licenses/lucene-join-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-join-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 94a18fd3c3913..0000000000000 --- a/server/licenses/lucene-join-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -cfabaedd80fe600cc7fda5ee12d90927fa96d87c \ No newline at end of file diff --git a/server/licenses/lucene-join-8.5.0.jar.sha1 b/server/licenses/lucene-join-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..fa153682841ab --- /dev/null +++ b/server/licenses/lucene-join-8.5.0.jar.sha1 @@ -0,0 +1 @@ +98713495a3f48558a5acd7aaa22bfa7da394e78a \ No newline at end of file diff --git a/server/licenses/lucene-memory-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-memory-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index ae7da5c259f69..0000000000000 --- a/server/licenses/lucene-memory-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -e90fbcc53531978fc03ef847ba396d4cdd89c7e4 \ No newline at end of file diff --git a/server/licenses/lucene-memory-8.5.0.jar.sha1 b/server/licenses/lucene-memory-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..402e037eb1a20 --- /dev/null +++ b/server/licenses/lucene-memory-8.5.0.jar.sha1 @@ -0,0 +1 @@ +18e3ed87c7f29bb0fe4b5db244a4f31018a9e518 \ No newline at end of file diff --git a/server/licenses/lucene-misc-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-misc-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 8f6c2fe6d63c1..0000000000000 --- a/server/licenses/lucene-misc-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -dd0b4cef132a50b3fa919f214a5316fcc78c46ea \ No newline at end of file diff --git a/server/licenses/lucene-misc-8.5.0.jar.sha1 b/server/licenses/lucene-misc-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..fa96c7ab7e70c --- /dev/null +++ b/server/licenses/lucene-misc-8.5.0.jar.sha1 @@ -0,0 +1 @@ +9786e13764f55dbf28e71fe7e0a90d1e94bea0bc \ No newline at end of file diff --git a/server/licenses/lucene-queries-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-queries-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 4b2a346309f3f..0000000000000 --- a/server/licenses/lucene-queries-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -c90cc35089afc3f7802668c3969b5e7391b6d15a \ No newline at end of file diff --git a/server/licenses/lucene-queries-8.5.0.jar.sha1 b/server/licenses/lucene-queries-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..ec92e3918b4d5 --- /dev/null +++ b/server/licenses/lucene-queries-8.5.0.jar.sha1 @@ -0,0 +1 @@ +708c1f850ed70c506822b021a722e42f29c397a1 \ 
No newline at end of file diff --git a/server/licenses/lucene-queryparser-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-queryparser-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 3b5ec91e4f938..0000000000000 --- a/server/licenses/lucene-queryparser-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -e324233cb8f069e4f6abcbab47368a83c3696f36 \ No newline at end of file diff --git a/server/licenses/lucene-queryparser-8.5.0.jar.sha1 b/server/licenses/lucene-queryparser-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..3fe3e5fc193b0 --- /dev/null +++ b/server/licenses/lucene-queryparser-8.5.0.jar.sha1 @@ -0,0 +1 @@ +13c38f39b1a7d10c4749ba789fa95da5868d4885 \ No newline at end of file diff --git a/server/licenses/lucene-sandbox-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-sandbox-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 58d7518e7aefc..0000000000000 --- a/server/licenses/lucene-sandbox-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -985a451e5f564c84271419a446e044ab589d6f22 \ No newline at end of file diff --git a/server/licenses/lucene-sandbox-8.5.0.jar.sha1 b/server/licenses/lucene-sandbox-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..f36b8b978bede --- /dev/null +++ b/server/licenses/lucene-sandbox-8.5.0.jar.sha1 @@ -0,0 +1 @@ +2b275921f2fd92b15b4f1a2a565467c3fa221ef9 \ No newline at end of file diff --git a/server/licenses/lucene-spatial-extras-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-spatial-extras-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index f51b876ad5fdf..0000000000000 --- a/server/licenses/lucene-spatial-extras-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -beff7cafe0fa5330b9b915825b69321faf0fcaa9 \ No newline at end of file diff --git a/server/licenses/lucene-spatial-extras-8.5.0.jar.sha1 b/server/licenses/lucene-spatial-extras-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..02c7de2cd466f --- /dev/null +++ b/server/licenses/lucene-spatial-extras-8.5.0.jar.sha1 @@ -0,0 +1 @@ +a8603576227b03fa94c2cde81b877f711c8b4c3f \ No newline at end of file diff --git a/server/licenses/lucene-spatial3d-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-spatial3d-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 736c206ba1b48..0000000000000 --- a/server/licenses/lucene-spatial3d-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -b9256d3a2a64d79435a4c726af8a3c28c2b77d7f \ No newline at end of file diff --git a/server/licenses/lucene-spatial3d-8.5.0.jar.sha1 b/server/licenses/lucene-spatial3d-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..df28d9295c4a0 --- /dev/null +++ b/server/licenses/lucene-spatial3d-8.5.0.jar.sha1 @@ -0,0 +1 @@ +6f05de532ca9497e4d8364addf123441b01372a8 \ No newline at end of file diff --git a/server/licenses/lucene-suggest-8.5.0-snapshot-7f057455901.jar.sha1 b/server/licenses/lucene-suggest-8.5.0-snapshot-7f057455901.jar.sha1 deleted file mode 100644 index 586728768c5b4..0000000000000 --- a/server/licenses/lucene-suggest-8.5.0-snapshot-7f057455901.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -f38949db273a910e94a57229db2d8f3e4aef5e1f \ No newline at end of file diff --git a/server/licenses/lucene-suggest-8.5.0.jar.sha1 b/server/licenses/lucene-suggest-8.5.0.jar.sha1 new file mode 100644 index 0000000000000..b0abc5dc57319 --- /dev/null +++ b/server/licenses/lucene-suggest-8.5.0.jar.sha1 @@ -0,0 +1 @@ +4ae6bea433acecbbaf7ae8fa3d56941ec2ad004d \ No newline 
at end of file diff --git a/server/src/main/java/org/elasticsearch/action/ActionModule.java b/server/src/main/java/org/elasticsearch/action/ActionModule.java index affbb7a41dd31..c64ba9771ff39 100644 --- a/server/src/main/java/org/elasticsearch/action/ActionModule.java +++ b/server/src/main/java/org/elasticsearch/action/ActionModule.java @@ -21,12 +21,16 @@ import org.apache.logging.log4j.LogManager; import org.apache.logging.log4j.Logger; +import org.elasticsearch.Build; import org.elasticsearch.action.admin.cluster.allocation.ClusterAllocationExplainAction; import org.elasticsearch.action.admin.cluster.allocation.TransportClusterAllocationExplainAction; import org.elasticsearch.action.admin.cluster.configuration.AddVotingConfigExclusionsAction; import org.elasticsearch.action.admin.cluster.configuration.ClearVotingConfigExclusionsAction; import org.elasticsearch.action.admin.cluster.configuration.TransportAddVotingConfigExclusionsAction; import org.elasticsearch.action.admin.cluster.configuration.TransportClearVotingConfigExclusionsAction; +import org.elasticsearch.action.admin.indices.datastream.DeleteDataStreamAction; +import org.elasticsearch.action.admin.indices.datastream.GetDataStreamsAction; +import org.elasticsearch.action.admin.indices.datastream.CreateDataStreamAction; import org.elasticsearch.action.admin.cluster.health.ClusterHealthAction; import org.elasticsearch.action.admin.cluster.health.TransportClusterHealthAction; import org.elasticsearch.action.admin.cluster.node.hotthreads.NodesHotThreadsAction; @@ -142,11 +146,17 @@ import org.elasticsearch.action.admin.indices.shrink.TransportResizeAction; import org.elasticsearch.action.admin.indices.stats.IndicesStatsAction; import org.elasticsearch.action.admin.indices.stats.TransportIndicesStatsAction; +import org.elasticsearch.action.admin.indices.template.delete.DeleteComponentTemplateAction; import org.elasticsearch.action.admin.indices.template.delete.DeleteIndexTemplateAction; +import org.elasticsearch.action.admin.indices.template.delete.TransportDeleteComponentTemplateAction; import org.elasticsearch.action.admin.indices.template.delete.TransportDeleteIndexTemplateAction; +import org.elasticsearch.action.admin.indices.template.get.GetComponentTemplateAction; import org.elasticsearch.action.admin.indices.template.get.GetIndexTemplatesAction; +import org.elasticsearch.action.admin.indices.template.get.TransportGetComponentTemplateAction; import org.elasticsearch.action.admin.indices.template.get.TransportGetIndexTemplatesAction; +import org.elasticsearch.action.admin.indices.template.put.PutComponentTemplateAction; import org.elasticsearch.action.admin.indices.template.put.PutIndexTemplateAction; +import org.elasticsearch.action.admin.indices.template.put.TransportPutComponentTemplateAction; import org.elasticsearch.action.admin.indices.template.put.TransportPutIndexTemplateAction; import org.elasticsearch.action.admin.indices.upgrade.get.TransportUpgradeStatusAction; import org.elasticsearch.action.admin.indices.upgrade.get.UpgradeStatusAction; @@ -205,6 +215,7 @@ import org.elasticsearch.client.node.NodeClient; import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; import org.elasticsearch.cluster.node.DiscoveryNodes; +import org.elasticsearch.cluster.service.ClusterService; import org.elasticsearch.common.NamedRegistry; import org.elasticsearch.common.inject.AbstractModule; import org.elasticsearch.common.inject.TypeLiteral; @@ -243,9 +254,11 @@ import 
org.elasticsearch.rest.action.admin.cluster.RestClusterStatsAction; import org.elasticsearch.rest.action.admin.cluster.RestClusterUpdateSettingsAction; import org.elasticsearch.rest.action.admin.cluster.RestCreateSnapshotAction; +import org.elasticsearch.rest.action.admin.indices.RestDeleteDataStreamAction; import org.elasticsearch.rest.action.admin.cluster.RestDeleteRepositoryAction; import org.elasticsearch.rest.action.admin.cluster.RestDeleteSnapshotAction; import org.elasticsearch.rest.action.admin.cluster.RestDeleteStoredScriptAction; +import org.elasticsearch.rest.action.admin.indices.RestGetDataStreamsAction; import org.elasticsearch.rest.action.admin.cluster.RestGetRepositoriesAction; import org.elasticsearch.rest.action.admin.cluster.RestGetScriptContextAction; import org.elasticsearch.rest.action.admin.cluster.RestGetScriptLanguageAction; @@ -258,6 +271,7 @@ import org.elasticsearch.rest.action.admin.cluster.RestNodesStatsAction; import org.elasticsearch.rest.action.admin.cluster.RestNodesUsageAction; import org.elasticsearch.rest.action.admin.cluster.RestPendingClusterTasksAction; +import org.elasticsearch.rest.action.admin.indices.RestCreateDataStreamAction; import org.elasticsearch.rest.action.admin.cluster.RestPutRepositoryAction; import org.elasticsearch.rest.action.admin.cluster.RestPutStoredScriptAction; import org.elasticsearch.rest.action.admin.cluster.RestReloadSecureSettingsAction; @@ -269,11 +283,13 @@ import org.elasticsearch.rest.action.admin.indices.RestClearIndicesCacheAction; import org.elasticsearch.rest.action.admin.indices.RestCloseIndexAction; import org.elasticsearch.rest.action.admin.indices.RestCreateIndexAction; +import org.elasticsearch.rest.action.admin.indices.RestDeleteComponentTemplateAction; import org.elasticsearch.rest.action.admin.indices.RestDeleteIndexAction; import org.elasticsearch.rest.action.admin.indices.RestDeleteIndexTemplateAction; import org.elasticsearch.rest.action.admin.indices.RestFlushAction; import org.elasticsearch.rest.action.admin.indices.RestForceMergeAction; import org.elasticsearch.rest.action.admin.indices.RestGetAliasesAction; +import org.elasticsearch.rest.action.admin.indices.RestGetComponentTemplateAction; import org.elasticsearch.rest.action.admin.indices.RestGetFieldMappingAction; import org.elasticsearch.rest.action.admin.indices.RestGetIndexTemplateAction; import org.elasticsearch.rest.action.admin.indices.RestGetIndicesAction; @@ -286,6 +302,7 @@ import org.elasticsearch.rest.action.admin.indices.RestIndicesShardStoresAction; import org.elasticsearch.rest.action.admin.indices.RestIndicesStatsAction; import org.elasticsearch.rest.action.admin.indices.RestOpenIndexAction; +import org.elasticsearch.rest.action.admin.indices.RestPutComponentTemplateAction; import org.elasticsearch.rest.action.admin.indices.RestPutIndexTemplateAction; import org.elasticsearch.rest.action.admin.indices.RestPutMappingAction; import org.elasticsearch.rest.action.admin.indices.RestRecoveryAction; @@ -361,6 +378,35 @@ public class ActionModule extends AbstractModule { private static final Logger logger = LogManager.getLogger(ActionModule.class); + private static final boolean ITV2_FEATURE_FLAG_REGISTERED; + + static { + final String property = System.getProperty("es.itv2_feature_flag_registered"); + if (Build.CURRENT.isSnapshot() || "true".equals(property)) { + ITV2_FEATURE_FLAG_REGISTERED = true; + } else if ("false".equals(property) || property == null) { + ITV2_FEATURE_FLAG_REGISTERED = false; + } else { + throw new 
IllegalArgumentException("expected es.itv2_feature_flag_registered to be unset, true, or false but was [" + + property + "]"); + } + } + + private static final boolean DATASTREAMS_FEATURE_FLAG_REGISTERED; + + static { + final String property = System.getProperty("es.datastreams_feature_flag_registered"); + if (Build.CURRENT.isSnapshot() || "true".equals(property)) { + DATASTREAMS_FEATURE_FLAG_REGISTERED = true; + } else if ("false".equals(property) || property == null) { + DATASTREAMS_FEATURE_FLAG_REGISTERED = false; + } else { + throw new IllegalArgumentException( + "expected es.datastreams_feature_flag_registered to be unset or [true|false] but was [" + property + "]" + ); + } + } + private final Settings settings; private final IndexNameExpressionResolver indexNameExpressionResolver; private final IndexScopedSettings indexScopedSettings; @@ -374,17 +420,19 @@ public class ActionModule extends AbstractModule { private final RestController restController; private final RequestValidators mappingRequestValidators; private final RequestValidators indicesAliasesRequestRequestValidators; + private final ClusterService clusterService; public ActionModule(Settings settings, IndexNameExpressionResolver indexNameExpressionResolver, IndexScopedSettings indexScopedSettings, ClusterSettings clusterSettings, SettingsFilter settingsFilter, ThreadPool threadPool, List actionPlugins, NodeClient nodeClient, - CircuitBreakerService circuitBreakerService, UsageService usageService) { + CircuitBreakerService circuitBreakerService, UsageService usageService, ClusterService clusterService) { this.settings = settings; this.indexNameExpressionResolver = indexNameExpressionResolver; this.indexScopedSettings = indexScopedSettings; this.clusterSettings = clusterSettings; this.settingsFilter = settingsFilter; this.actionPlugins = actionPlugins; + this.clusterService = clusterService; actions = setupActions(actionPlugins); actionFilters = setupActionFilters(actionPlugins); autoCreateIndex = new AutoCreateIndex(settings, clusterSettings, indexNameExpressionResolver); @@ -409,10 +457,10 @@ public ActionModule(Settings settings, IndexNameExpressionResolver indexNameExpr indicesAliasesRequestRequestValidators = new RequestValidators<>( actionPlugins.stream().flatMap(p -> p.indicesAliasesRequestValidators().stream()).collect(Collectors.toList())); - final boolean restrictSystemIndices = RestController.RESTRICT_SYSTEM_INDICES.get(settings); - restController = new RestController(headers, restWrapper, nodeClient, circuitBreakerService, usageService, restrictSystemIndices); + restController = new RestController(headers, restWrapper, nodeClient, circuitBreakerService, usageService); } + public Map> getActions() { return actions; } @@ -486,6 +534,11 @@ public void reg actions.register(PutIndexTemplateAction.INSTANCE, TransportPutIndexTemplateAction.class); actions.register(GetIndexTemplatesAction.INSTANCE, TransportGetIndexTemplatesAction.class); actions.register(DeleteIndexTemplateAction.INSTANCE, TransportDeleteIndexTemplateAction.class); + if (ITV2_FEATURE_FLAG_REGISTERED) { + actions.register(PutComponentTemplateAction.INSTANCE, TransportPutComponentTemplateAction.class); + actions.register(GetComponentTemplateAction.INSTANCE, TransportGetComponentTemplateAction.class); + actions.register(DeleteComponentTemplateAction.INSTANCE, TransportDeleteComponentTemplateAction.class); + } actions.register(ValidateQueryAction.INSTANCE, TransportValidateQueryAction.class); actions.register(RefreshAction.INSTANCE, 
TransportRefreshAction.class); actions.register(FlushAction.INSTANCE, TransportFlushAction.class); @@ -533,6 +586,13 @@ public void reg actionPlugins.stream().flatMap(p -> p.getActions().stream()).forEach(actions::register); + + // Data streams: + if (DATASTREAMS_FEATURE_FLAG_REGISTERED) { + actions.register(CreateDataStreamAction.INSTANCE, CreateDataStreamAction.TransportAction.class); + actions.register(DeleteDataStreamAction.INSTANCE, DeleteDataStreamAction.TransportAction.class); + actions.register(GetDataStreamsAction.INSTANCE, GetDataStreamsAction.TransportAction.class); + } + // Persistent tasks: actions.register(StartPersistentTaskAction.INSTANCE, StartPersistentTaskAction.TransportAction.class); actions.register(UpdatePersistentTaskStatusAction.INSTANCE, UpdatePersistentTaskStatusAction.TransportAction.class); @@ -621,6 +681,11 @@ public void initRestHandlers(Supplier<DiscoveryNodes> nodesInCluster) { registerHandler.accept(new RestGetIndexTemplateAction()); registerHandler.accept(new RestPutIndexTemplateAction()); registerHandler.accept(new RestDeleteIndexTemplateAction()); + if (ITV2_FEATURE_FLAG_REGISTERED) { + registerHandler.accept(new RestPutComponentTemplateAction()); + registerHandler.accept(new RestGetComponentTemplateAction()); + registerHandler.accept(new RestDeleteComponentTemplateAction()); + } registerHandler.accept(new RestPutMappingAction()); registerHandler.accept(new RestGetMappingAction()); @@ -636,7 +701,7 @@ public void initRestHandlers(Supplier<DiscoveryNodes> nodesInCluster) { registerHandler.accept(new RestIndexAction()); registerHandler.accept(new CreateHandler()); - registerHandler.accept(new AutoIdHandler(nodesInCluster)); + registerHandler.accept(new AutoIdHandler(clusterService)); registerHandler.accept(new RestGetAction()); registerHandler.accept(new RestGetSourceAction()); registerHandler.accept(new RestMultiGetAction(settings)); @@ -680,6 +745,13 @@ public void initRestHandlers(Supplier<DiscoveryNodes> nodesInCluster) { registerHandler.accept(new RestDeletePipelineAction()); registerHandler.accept(new RestSimulatePipelineAction()); + + // Data Stream API + if (DATASTREAMS_FEATURE_FLAG_REGISTERED) { + registerHandler.accept(new RestCreateDataStreamAction()); + registerHandler.accept(new RestDeleteDataStreamAction()); + registerHandler.accept(new RestGetDataStreamsAction()); + } + // CAT API registerHandler.accept(new RestAllocationAction()); registerHandler.accept(new RestShardsAction()); diff --git a/server/src/main/java/org/elasticsearch/action/admin/cluster/configuration/TransportAddVotingConfigExclusionsAction.java b/server/src/main/java/org/elasticsearch/action/admin/cluster/configuration/TransportAddVotingConfigExclusionsAction.java index 7dcbd863d29e1..a18e5eb6c67cb 100644 --- a/server/src/main/java/org/elasticsearch/action/admin/cluster/configuration/TransportAddVotingConfigExclusionsAction.java +++ b/server/src/main/java/org/elasticsearch/action/admin/cluster/configuration/TransportAddVotingConfigExclusionsAction.java @@ -39,8 +39,10 @@ import org.elasticsearch.common.Priority; import org.elasticsearch.common.inject.Inject; import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.settings.ClusterSettings; import org.elasticsearch.common.settings.Setting; import org.elasticsearch.common.settings.Setting.Property; +import org.elasticsearch.common.settings.Settings; import org.elasticsearch.common.unit.TimeValue; import org.elasticsearch.tasks.Task; import org.elasticsearch.threadpool.ThreadPool; @@ -60,11 +62,21 @@ public class
TransportAddVotingConfigExclusionsAction extends TransportMasterNodeAction public static final Setting<Integer> MAXIMUM_VOTING_CONFIG_EXCLUSIONS_SETTING = Setting.intSetting("cluster.max_voting_config_exclusions", 10, 1, Property.Dynamic, Property.NodeScope); + private volatile int maxVotingConfigExclusions; + @Inject - public TransportAddVotingConfigExclusionsAction(TransportService transportService, ClusterService clusterService, ThreadPool threadPool, - ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) { + public TransportAddVotingConfigExclusionsAction(Settings settings, ClusterSettings clusterSettings, TransportService transportService, + ClusterService clusterService, ThreadPool threadPool, ActionFilters actionFilters, + IndexNameExpressionResolver indexNameExpressionResolver) { super(AddVotingConfigExclusionsAction.NAME, transportService, clusterService, threadPool, actionFilters, AddVotingConfigExclusionsRequest::new, indexNameExpressionResolver); + + maxVotingConfigExclusions = MAXIMUM_VOTING_CONFIG_EXCLUSIONS_SETTING.get(settings); + clusterSettings.addSettingsUpdateConsumer(MAXIMUM_VOTING_CONFIG_EXCLUSIONS_SETTING, this::setMaxVotingConfigExclusions); + } + + private void setMaxVotingConfigExclusions(int maxVotingConfigExclusions) { + this.maxVotingConfigExclusions = maxVotingConfigExclusions; } @Override @@ -81,7 +93,8 @@ protected AddVotingConfigExclusionsResponse read(StreamInput in) throws IOException protected void masterOperation(Task task, AddVotingConfigExclusionsRequest request, ClusterState state, ActionListener<AddVotingConfigExclusionsResponse> listener) throws Exception { - resolveVotingConfigExclusionsAndCheckMaximum(request, state); // throws IAE if no nodes matched or maximum exceeded + resolveVotingConfigExclusionsAndCheckMaximum(request, state, maxVotingConfigExclusions); + // throws IAE if no nodes matched or maximum exceeded clusterService.submitStateUpdateTask("add-voting-config-exclusions", new ClusterStateUpdateTask(Priority.URGENT) { @@ -90,14 +103,14 @@ protected void masterOperation(Task task, AddVotingConfigExclusionsRequest request, @Override public ClusterState execute(ClusterState currentState) { assert resolvedExclusions == null : resolvedExclusions; - resolvedExclusions = resolveVotingConfigExclusionsAndCheckMaximum(request, currentState); + final int finalMaxVotingConfigExclusions = TransportAddVotingConfigExclusionsAction.this.maxVotingConfigExclusions; + resolvedExclusions = resolveVotingConfigExclusionsAndCheckMaximum(request, currentState, finalMaxVotingConfigExclusions); final CoordinationMetaData.Builder builder = CoordinationMetaData.builder(currentState.coordinationMetaData()); resolvedExclusions.forEach(builder::addVotingConfigExclusion); final MetaData newMetaData = MetaData.builder(currentState.metaData()).coordinationMetaData(builder.build()).build(); final ClusterState newState = ClusterState.builder(currentState).metaData(newMetaData).build(); - assert newState.getVotingConfigExclusions().size() <= MAXIMUM_VOTING_CONFIG_EXCLUSIONS_SETTING.get( - currentState.metaData().settings()); + assert newState.getVotingConfigExclusions().size() <= finalMaxVotingConfigExclusions; return newState; } @@ -149,9 +162,10 @@ public void onTimeout(TimeValue timeout) { } private static Set<VotingConfigExclusion> resolveVotingConfigExclusionsAndCheckMaximum(AddVotingConfigExclusionsRequest request, - ClusterState state) { - return request.resolveVotingConfigExclusionsAndCheckMaximum(state, - MAXIMUM_VOTING_CONFIG_EXCLUSIONS_SETTING.get(state.metaData().settings()),
MAXIMUM_VOTING_CONFIG_EXCLUSIONS_SETTING.getKey()); + ClusterState state, + int maxVotingConfigExclusions) { + return request.resolveVotingConfigExclusionsAndCheckMaximum(state, maxVotingConfigExclusions, + MAXIMUM_VOTING_CONFIG_EXCLUSIONS_SETTING.getKey()); } @Override diff --git a/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthRequest.java b/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthRequest.java index bbe24d7c0443f..ef6538f210d45 100644 --- a/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthRequest.java +++ b/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthRequest.java @@ -73,7 +73,7 @@ public ClusterHealthRequest(StreamInput in) throws IOException { } timeout = in.readTimeValue(); if (in.readBoolean()) { - waitForStatus = ClusterHealthStatus.fromValue(in.readByte()); + waitForStatus = ClusterHealthStatus.readFrom(in); } waitForNoRelocatingShards = in.readBoolean(); waitForActiveShards = ActiveShardCount.readFrom(in); diff --git a/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthResponse.java b/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthResponse.java index 163c66003f4d7..2bcae763201fa 100644 --- a/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthResponse.java +++ b/server/src/main/java/org/elasticsearch/action/admin/cluster/health/ClusterHealthResponse.java @@ -147,7 +147,7 @@ public ClusterHealthResponse() {} public ClusterHealthResponse(StreamInput in) throws IOException { super(in); clusterName = in.readString(); - clusterHealthStatus = ClusterHealthStatus.fromValue(in.readByte()); + clusterHealthStatus = ClusterHealthStatus.readFrom(in); clusterStateHealth = new ClusterStateHealth(in); numberOfPendingTasks = in.readInt(); timedOut = in.readBoolean(); diff --git a/server/src/main/java/org/elasticsearch/action/admin/cluster/stats/ClusterStatsNodeResponse.java b/server/src/main/java/org/elasticsearch/action/admin/cluster/stats/ClusterStatsNodeResponse.java index b32ed7e39d239..d315fc4344a9e 100644 --- a/server/src/main/java/org/elasticsearch/action/admin/cluster/stats/ClusterStatsNodeResponse.java +++ b/server/src/main/java/org/elasticsearch/action/admin/cluster/stats/ClusterStatsNodeResponse.java @@ -42,7 +42,7 @@ public ClusterStatsNodeResponse(StreamInput in) throws IOException { super(in); clusterStatus = null; if (in.readBoolean()) { - clusterStatus = ClusterHealthStatus.fromValue(in.readByte()); + clusterStatus = ClusterHealthStatus.readFrom(in); } this.nodeInfo = new NodeInfo(in); this.nodeStats = new NodeStats(in); diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/alias/IndicesAliasesRequest.java b/server/src/main/java/org/elasticsearch/action/admin/indices/alias/IndicesAliasesRequest.java index 39f9d7239f5ce..8c77bfa24ebbf 100644 --- a/server/src/main/java/org/elasticsearch/action/admin/indices/alias/IndicesAliasesRequest.java +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/alias/IndicesAliasesRequest.java @@ -553,12 +553,13 @@ public boolean equals(Object obj) { && Objects.equals(routing, other.routing) && Objects.equals(indexRouting, other.indexRouting) && Objects.equals(searchRouting, other.searchRouting) - && Objects.equals(writeIndex, other.writeIndex); + && Objects.equals(writeIndex, other.writeIndex) + && Objects.equals(isHidden, other.isHidden); } @Override public int hashCode() { - return 
Objects.hash(type, indices, aliases, filter, routing, indexRouting, searchRouting, writeIndex); + return Objects.hash(type, indices, aliases, filter, routing, indexRouting, searchRouting, writeIndex, isHidden); } } diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/CreateDataStreamAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/CreateDataStreamAction.java new file mode 100644 index 0000000000000..37f190691f33a --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/CreateDataStreamAction.java @@ -0,0 +1,181 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.action.admin.indices.datastream; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.elasticsearch.action.ActionListener; +import org.elasticsearch.action.ActionRequestValidationException; +import org.elasticsearch.action.ActionType; +import org.elasticsearch.action.ValidateActions; +import org.elasticsearch.action.support.ActionFilters; +import org.elasticsearch.action.support.master.AcknowledgedResponse; +import org.elasticsearch.action.support.master.MasterNodeRequest; +import org.elasticsearch.action.support.master.TransportMasterNodeAction; +import org.elasticsearch.cluster.ClusterState; +import org.elasticsearch.cluster.ClusterStateUpdateTask; +import org.elasticsearch.cluster.block.ClusterBlockException; +import org.elasticsearch.cluster.block.ClusterBlockLevel; +import org.elasticsearch.cluster.metadata.DataStream; +import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; +import org.elasticsearch.cluster.metadata.MetaData; +import org.elasticsearch.cluster.service.ClusterService; +import org.elasticsearch.common.Priority; +import org.elasticsearch.common.Strings; +import org.elasticsearch.common.inject.Inject; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.io.stream.StreamOutput; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.tasks.Task; +import org.elasticsearch.threadpool.ThreadPool; +import org.elasticsearch.transport.TransportService; + +import java.io.IOException; +import java.util.Collections; +import java.util.Objects; + +public class CreateDataStreamAction extends ActionType<AcknowledgedResponse> { + + private static final Logger logger = LogManager.getLogger(CreateDataStreamAction.class); + + public static final CreateDataStreamAction INSTANCE = new CreateDataStreamAction(); + public static final String NAME = "indices:admin/data_stream/create"; + + private CreateDataStreamAction() { + super(NAME, AcknowledgedResponse::new); + } + + public static class Request extends MasterNodeRequest<Request> { + + private final String name; + private String
timestampFieldName; + + public Request(String name) { + this.name = name; + } + + public void setTimestampFieldName(String timestampFieldName) { + this.timestampFieldName = timestampFieldName; + } + + @Override + public ActionRequestValidationException validate() { + ActionRequestValidationException validationException = null; + if (Strings.hasText(name) == false) { + validationException = ValidateActions.addValidationError("name is missing", validationException); + } + if (Strings.hasText(timestampFieldName) == false) { + validationException = ValidateActions.addValidationError("timestamp field name is missing", validationException); + } + return validationException; + } + + public Request(StreamInput in) throws IOException { + super(in); + this.name = in.readString(); + this.timestampFieldName = in.readString(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + super.writeTo(out); + out.writeString(name); + out.writeString(timestampFieldName); + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Request request = (Request) o; + return name.equals(request.name) && + timestampFieldName.equals(request.timestampFieldName); + } + + @Override + public int hashCode() { + return Objects.hash(name, timestampFieldName); + } + } + + public static class TransportAction extends TransportMasterNodeAction<Request, AcknowledgedResponse> { + + @Inject + public TransportAction(TransportService transportService, ClusterService clusterService, ThreadPool threadPool, + ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) { + super(NAME, transportService, clusterService, threadPool, actionFilters, Request::new, indexNameExpressionResolver); + } + + @Override + protected String executor() { + return ThreadPool.Names.SAME; + } + + @Override + protected AcknowledgedResponse read(StreamInput in) throws IOException { + return new AcknowledgedResponse(in); + } + + @Override + protected void masterOperation(Task task, Request request, ClusterState state, + ActionListener<AcknowledgedResponse> listener) throws Exception { + clusterService.submitStateUpdateTask("create-data-stream [" + request.name + "]", + new ClusterStateUpdateTask(Priority.HIGH) { + + @Override + public TimeValue timeout() { + return request.masterNodeTimeout(); + } + + @Override + public void onFailure(String source, Exception e) { + listener.onFailure(e); + } + + @Override + public ClusterState execute(ClusterState currentState) throws Exception { + return createDataStream(currentState, request); + } + + @Override + public void clusterStateProcessed(String source, ClusterState oldState, ClusterState newState) { + listener.onResponse(new AcknowledgedResponse(true)); + } + }); + } + + static ClusterState createDataStream(ClusterState currentState, Request request) { + if (currentState.metaData().dataStreams().containsKey(request.name)) { + throw new IllegalArgumentException("data_stream [" + request.name + "] already exists"); + } + + MetaData.Builder builder = MetaData.builder(currentState.metaData()).put( + new DataStream(request.name, request.timestampFieldName, Collections.emptyList())); + + logger.info("adding data stream [{}]", request.name); + return ClusterState.builder(currentState).metaData(builder).build(); + } + + @Override + protected ClusterBlockException checkBlock(Request request, ClusterState state) { + return state.blocks().globalBlockedException(ClusterBlockLevel.METADATA_WRITE); + } + } + +} diff --git
a/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/DeleteDataStreamAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/DeleteDataStreamAction.java new file mode 100644 index 0000000000000..91444ef64320b --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/DeleteDataStreamAction.java @@ -0,0 +1,182 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.action.admin.indices.datastream; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.elasticsearch.ResourceNotFoundException; +import org.elasticsearch.action.ActionListener; +import org.elasticsearch.action.ActionRequestValidationException; +import org.elasticsearch.action.ActionType; +import org.elasticsearch.action.ValidateActions; +import org.elasticsearch.action.support.ActionFilters; +import org.elasticsearch.action.support.master.AcknowledgedResponse; +import org.elasticsearch.action.support.master.MasterNodeRequest; +import org.elasticsearch.action.support.master.TransportMasterNodeAction; +import org.elasticsearch.cluster.ClusterState; +import org.elasticsearch.cluster.ClusterStateUpdateTask; +import org.elasticsearch.cluster.block.ClusterBlockException; +import org.elasticsearch.cluster.block.ClusterBlockLevel; +import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; +import org.elasticsearch.cluster.metadata.MetaData; +import org.elasticsearch.cluster.service.ClusterService; +import org.elasticsearch.common.Priority; +import org.elasticsearch.common.Strings; +import org.elasticsearch.common.inject.Inject; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.io.stream.StreamOutput; +import org.elasticsearch.common.regex.Regex; +import org.elasticsearch.common.unit.TimeValue; +import org.elasticsearch.tasks.Task; +import org.elasticsearch.threadpool.ThreadPool; +import org.elasticsearch.transport.TransportService; + +import java.io.IOException; +import java.util.HashSet; +import java.util.Objects; +import java.util.Set; + +public class DeleteDataStreamAction extends ActionType { + + private static final Logger logger = LogManager.getLogger(DeleteDataStreamAction.class); + + public static final DeleteDataStreamAction INSTANCE = new DeleteDataStreamAction(); + public static final String NAME = "indices:admin/data_stream/delete"; + + private DeleteDataStreamAction() { + super(NAME, AcknowledgedResponse::new); + } + + public static class Request extends MasterNodeRequest { + + private final String name; + + public Request(String name) { + this.name = Objects.requireNonNull(name); + } + + @Override + public ActionRequestValidationException validate() { + ActionRequestValidationException 
validationException = null; + if (Strings.hasText(name) == false) { + validationException = ValidateActions.addValidationError("name is missing", validationException); + } + return validationException; + } + + public Request(StreamInput in) throws IOException { + super(in); + this.name = in.readString(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + super.writeTo(out); + out.writeString(name); + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Request request = (Request) o; + return name.equals(request.name); + } + + @Override + public int hashCode() { + return Objects.hash(name); + } + } + + public static class TransportAction extends TransportMasterNodeAction { + + @Inject + public TransportAction(TransportService transportService, ClusterService clusterService, ThreadPool threadPool, + ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) { + super(NAME, transportService, clusterService, threadPool, actionFilters, Request::new, indexNameExpressionResolver); + } + + @Override + protected String executor() { + return ThreadPool.Names.SAME; + } + + @Override + protected AcknowledgedResponse read(StreamInput in) throws IOException { + return new AcknowledgedResponse(in); + } + + @Override + protected void masterOperation(Task task, Request request, ClusterState state, + ActionListener listener) throws Exception { + clusterService.submitStateUpdateTask("remove-data-stream [" + request.name + "]", new ClusterStateUpdateTask(Priority.HIGH) { + + @Override + public TimeValue timeout() { + return request.masterNodeTimeout(); + } + + @Override + public void onFailure(String source, Exception e) { + listener.onFailure(e); + } + + @Override + public ClusterState execute(ClusterState currentState) { + return removeDataStream(currentState, request); + } + + @Override + public void clusterStateProcessed(String source, ClusterState oldState, ClusterState newState) { + listener.onResponse(new AcknowledgedResponse(true)); + } + }); + } + + static ClusterState removeDataStream(ClusterState currentState, Request request) { + Set dataStreams = new HashSet<>(); + for (String dataStreamName : currentState.metaData().dataStreams().keySet()) { + if (Regex.simpleMatch(request.name, dataStreamName)) { + dataStreams.add(dataStreamName); + } + } + if (dataStreams.isEmpty()) { + // if a match-all pattern was specified and no data streams were found because none exist, do not + // fail with data stream missing exception + if (Regex.isMatchAllPattern(request.name)) { + return currentState; + } + throw new ResourceNotFoundException("data_streams matching [" + request.name + "] not found"); + } + MetaData.Builder metaData = MetaData.builder(currentState.metaData()); + for (String dataStreamName : dataStreams) { + logger.info("removing data stream [{}]", dataStreamName); + metaData.removeDataStream(dataStreamName); + } + return ClusterState.builder(currentState).metaData(metaData).build(); + } + + @Override + protected ClusterBlockException checkBlock(Request request, ClusterState state) { + return state.blocks().globalBlockedException(ClusterBlockLevel.METADATA_WRITE); + } + } + +} diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/GetDataStreamsAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/GetDataStreamsAction.java new file mode 100644 index 0000000000000..ed2411401aeb6 --- /dev/null +++ 
b/server/src/main/java/org/elasticsearch/action/admin/indices/datastream/GetDataStreamsAction.java @@ -0,0 +1,192 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ +package org.elasticsearch.action.admin.indices.datastream; + +import org.elasticsearch.action.ActionListener; +import org.elasticsearch.action.ActionRequestValidationException; +import org.elasticsearch.action.ActionResponse; +import org.elasticsearch.action.ActionType; +import org.elasticsearch.action.support.ActionFilters; +import org.elasticsearch.action.support.master.MasterNodeReadRequest; +import org.elasticsearch.action.support.master.TransportMasterNodeReadAction; +import org.elasticsearch.cluster.ClusterState; +import org.elasticsearch.cluster.block.ClusterBlockException; +import org.elasticsearch.cluster.block.ClusterBlockLevel; +import org.elasticsearch.cluster.metadata.DataStream; +import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; +import org.elasticsearch.cluster.service.ClusterService; +import org.elasticsearch.common.inject.Inject; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.io.stream.StreamOutput; +import org.elasticsearch.common.regex.Regex; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; +import org.elasticsearch.tasks.Task; +import org.elasticsearch.threadpool.ThreadPool; +import org.elasticsearch.transport.TransportService; + +import java.io.IOException; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; +import java.util.Map; +import java.util.Objects; + +public class GetDataStreamsAction extends ActionType { + + public static final GetDataStreamsAction INSTANCE = new GetDataStreamsAction(); + public static final String NAME = "indices:admin/data_stream/get"; + + private GetDataStreamsAction() { + super(NAME, Response::new); + } + + public static class Request extends MasterNodeReadRequest { + + private final String[] names; + + public Request(String[] names) { + this.names = Objects.requireNonNull(names); + } + + @Override + public ActionRequestValidationException validate() { + return null; + } + + public Request(StreamInput in) throws IOException { + super(in); + this.names = in.readStringArray(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + super.writeTo(out); + out.writeStringArray(names); + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Request request = (Request) o; + return Arrays.equals(names, request.names); + } + + @Override + public int hashCode() { + return Arrays.hashCode(names); + } + } + + public static class Response extends ActionResponse implements 
ToXContentObject { + + private final List dataStreams; + + public Response(List dataStreams) { + this.dataStreams = dataStreams; + } + + public Response(StreamInput in) throws IOException { + this(in.readList(DataStream::new)); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + out.writeList(dataStreams); + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startArray(); + for (DataStream dataStream : dataStreams) { + dataStream.toXContent(builder, params); + } + builder.endArray(); + return builder; + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Response response = (Response) o; + return dataStreams.equals(response.dataStreams); + } + + @Override + public int hashCode() { + return Objects.hash(dataStreams); + } + } + + public static class TransportAction extends TransportMasterNodeReadAction { + + @Inject + public TransportAction(TransportService transportService, ClusterService clusterService, ThreadPool threadPool, + ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) { + super(NAME, transportService, clusterService, threadPool, actionFilters, Request::new, indexNameExpressionResolver); + } + + @Override + protected String executor() { + return ThreadPool.Names.SAME; + } + + @Override + protected Response read(StreamInput in) throws IOException { + return new Response(in); + } + + @Override + protected void masterOperation(Task task, Request request, ClusterState state, + ActionListener listener) throws Exception { + listener.onResponse(new Response(getDataStreams(state, request))); + } + + static List getDataStreams(ClusterState clusterState, Request request) { + Map dataStreams = clusterState.metaData().dataStreams(); + + // return all data streams if no name was specified + if (request.names.length == 0) { + return new ArrayList<>(dataStreams.values()); + } + + final List results = new ArrayList<>(); + for (String name : request.names) { + if (Regex.isSimpleMatchPattern(name)) { + for (Map.Entry entry : dataStreams.entrySet()) { + if (Regex.simpleMatch(name, entry.getKey())) { + results.add(entry.getValue()); + } + } + } else if (dataStreams.containsKey(name)) { + results.add(dataStreams.get(name)); + } + } + return results; + } + + @Override + protected ClusterBlockException checkBlock(Request request, ClusterState state) { + return state.blocks().globalBlockedException(ClusterBlockLevel.METADATA_WRITE); + } + } + +} diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/shards/IndicesShardStoresRequest.java b/server/src/main/java/org/elasticsearch/action/admin/indices/shards/IndicesShardStoresRequest.java index 45a9c7283d0ab..19f707aa13367 100644 --- a/server/src/main/java/org/elasticsearch/action/admin/indices/shards/IndicesShardStoresRequest.java +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/shards/IndicesShardStoresRequest.java @@ -55,7 +55,7 @@ public IndicesShardStoresRequest(StreamInput in) throws IOException { int nStatus = in.readVInt(); statuses = EnumSet.noneOf(ClusterHealthStatus.class); for (int i = 0; i < nStatus; i++) { - statuses.add(ClusterHealthStatus.fromValue(in.readByte())); + statuses.add(ClusterHealthStatus.readFrom(in)); } indicesOptions = IndicesOptions.readIndicesOptions(in); } diff --git 
a/server/src/main/java/org/elasticsearch/action/admin/indices/template/delete/DeleteComponentTemplateAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/template/delete/DeleteComponentTemplateAction.java new file mode 100644 index 0000000000000..a0dc94a92f14b --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/template/delete/DeleteComponentTemplateAction.java @@ -0,0 +1,90 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.action.admin.indices.template.delete; + +import org.elasticsearch.action.ActionRequestValidationException; +import org.elasticsearch.action.ActionType; +import org.elasticsearch.action.support.master.AcknowledgedResponse; +import org.elasticsearch.action.support.master.MasterNodeRequest; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.io.stream.StreamOutput; + +import java.io.IOException; + +import static org.elasticsearch.action.ValidateActions.addValidationError; + +public class DeleteComponentTemplateAction extends ActionType { + + public static final DeleteComponentTemplateAction INSTANCE = new DeleteComponentTemplateAction(); + public static final String NAME = "cluster:admin/component_template/delete"; + + private DeleteComponentTemplateAction() { + super(NAME, AcknowledgedResponse::new); + } + + public static class Request extends MasterNodeRequest { + + private String name; + + public Request(StreamInput in) throws IOException { + super(in); + name = in.readString(); + } + + public Request() { } + + /** + * Constructs a new delete index request for the specified name. + */ + public Request(String name) { + this.name = name; + } + + /** + * Set the index template name to delete. + */ + public Request name(String name) { + this.name = name; + return this; + } + + @Override + public ActionRequestValidationException validate() { + ActionRequestValidationException validationException = null; + if (name == null) { + validationException = addValidationError("name is missing", validationException); + } + return validationException; + } + + /** + * The index template name to delete. 
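+ * + * <p>For example (an illustrative sketch; the template name is hypothetical): + * <pre> + * Request request = new Request("my-component-template"); + * assert "my-component-template".equals(request.name()); + * </pre>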
+ */ + public String name() { + return name; + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + super.writeTo(out); + out.writeString(name); + } + } +} diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/template/delete/TransportDeleteComponentTemplateAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/template/delete/TransportDeleteComponentTemplateAction.java new file mode 100644 index 0000000000000..95b087868d31f --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/template/delete/TransportDeleteComponentTemplateAction.java @@ -0,0 +1,79 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.action.admin.indices.template.delete; + +import org.apache.logging.log4j.LogManager; +import org.apache.logging.log4j.Logger; +import org.elasticsearch.action.ActionListener; +import org.elasticsearch.action.support.ActionFilters; +import org.elasticsearch.action.support.master.AcknowledgedResponse; +import org.elasticsearch.action.support.master.TransportMasterNodeAction; +import org.elasticsearch.cluster.ClusterState; +import org.elasticsearch.cluster.block.ClusterBlockException; +import org.elasticsearch.cluster.block.ClusterBlockLevel; +import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; +import org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService; +import org.elasticsearch.cluster.service.ClusterService; +import org.elasticsearch.common.inject.Inject; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.tasks.Task; +import org.elasticsearch.threadpool.ThreadPool; +import org.elasticsearch.transport.TransportService; + +import java.io.IOException; + +public class TransportDeleteComponentTemplateAction + extends TransportMasterNodeAction { + + private static final Logger logger = LogManager.getLogger(TransportDeleteComponentTemplateAction.class); + + private final MetaDataIndexTemplateService indexTemplateService; + + @Inject + public TransportDeleteComponentTemplateAction(TransportService transportService, ClusterService clusterService, + ThreadPool threadPool, MetaDataIndexTemplateService indexTemplateService, + ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver) { + super(DeleteComponentTemplateAction.NAME, transportService, clusterService, threadPool, actionFilters, + DeleteComponentTemplateAction.Request::new, indexNameExpressionResolver); + this.indexTemplateService = indexTemplateService; + } + + @Override + protected String executor() { + // we go async right away + return ThreadPool.Names.SAME; + } + + @Override + protected AcknowledgedResponse read(StreamInput in) throws IOException { + return new AcknowledgedResponse(in); + } + + 
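+ // Removing a component template is a cluster metadata write, so surface any global metadata-write block instead of executing.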
@Override + protected ClusterBlockException checkBlock(DeleteComponentTemplateAction.Request request, ClusterState state) { + return state.blocks().globalBlockedException(ClusterBlockLevel.METADATA_WRITE); + } + + @Override + protected void masterOperation(Task task, final DeleteComponentTemplateAction.Request request, final ClusterState state, + final ActionListener listener) { + indexTemplateService.removeComponentTemplate(request.name(), request.masterNodeTimeout(), listener); + } +} diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/template/get/GetComponentTemplateAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/template/get/GetComponentTemplateAction.java new file mode 100644 index 0000000000000..34242c1d6aa9d --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/template/get/GetComponentTemplateAction.java @@ -0,0 +1,171 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.action.admin.indices.template.get; + +import org.elasticsearch.action.ActionRequestValidationException; +import org.elasticsearch.action.ActionResponse; +import org.elasticsearch.action.ActionType; +import org.elasticsearch.action.support.master.MasterNodeReadRequest; +import org.elasticsearch.cluster.metadata.ComponentTemplate; +import org.elasticsearch.common.ParseField; +import org.elasticsearch.common.Strings; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.io.stream.StreamOutput; +import org.elasticsearch.common.xcontent.ToXContentObject; +import org.elasticsearch.common.xcontent.XContentBuilder; + +import java.io.IOException; +import java.util.HashMap; +import java.util.Map; +import java.util.Objects; + +import static org.elasticsearch.action.ValidateActions.addValidationError; + +/** + * Action to retrieve one or more component templates + */ +public class GetComponentTemplateAction extends ActionType { + + public static final GetComponentTemplateAction INSTANCE = new GetComponentTemplateAction(); + public static final String NAME = "cluster:admin/component_template/get"; + + private GetComponentTemplateAction() { + super(NAME, GetComponentTemplateAction.Response::new); + } + + /** + * Request that to retrieve one or more component templates + */ + public static class Request extends MasterNodeReadRequest { + + private String[] names; + + public Request() { } + + public Request(String... 
names) { + this.names = names; + } + + public Request(StreamInput in) throws IOException { + super(in); + names = in.readStringArray(); + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + super.writeTo(out); + out.writeStringArray(names); + } + + @Override + public ActionRequestValidationException validate() { + ActionRequestValidationException validationException = null; + if (names == null) { + validationException = addValidationError("names is null or empty", validationException); + } else { + for (String name : names) { + if (name == null || Strings.hasText(name) == false) { + validationException = addValidationError("name is missing", validationException); + } + } + } + return validationException; + } + + /** + * Sets the names of the component templates. + */ + public Request names(String... names) { + this.names = names; + return this; + } + + /** + * The names of the component templates. + */ + public String[] names() { + return this.names; + } + } + + public static class Response extends ActionResponse implements ToXContentObject { + public static final ParseField NAME = new ParseField("name"); + public static final ParseField COMPONENT_TEMPLATES = new ParseField("component_templates"); + public static final ParseField COMPONENT_TEMPLATE = new ParseField("component_template"); + + private final Map componentTemplates; + + public Response(StreamInput in) throws IOException { + super(in); + int size = in.readVInt(); + componentTemplates = new HashMap<>(); + for (int i = 0 ; i < size ; i++) { + componentTemplates.put(in.readString(), new ComponentTemplate(in)); + } + } + + public Response(Map componentTemplates) { + this.componentTemplates = componentTemplates; + } + + public Map getComponentTemplates() { + return componentTemplates; + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + out.writeVInt(componentTemplates.size()); + for (Map.Entry componentTemplate : componentTemplates.entrySet()) { + out.writeString(componentTemplate.getKey()); + componentTemplate.getValue().writeTo(out); + } + } + + @Override + public boolean equals(Object o) { + if (this == o) return true; + if (o == null || getClass() != o.getClass()) return false; + Response that = (Response) o; + return Objects.equals(componentTemplates, that.componentTemplates); + } + + @Override + public int hashCode() { + return Objects.hash(componentTemplates); + } + + @Override + public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { + builder.startObject(); + builder.startArray(COMPONENT_TEMPLATES.getPreferredName()); + for (Map.Entry componentTemplate : this.componentTemplates.entrySet()) { + builder.startObject(); + builder.field(NAME.getPreferredName(), componentTemplate.getKey()); + builder.field(COMPONENT_TEMPLATE.getPreferredName(), componentTemplate.getValue()); + builder.endObject(); + } + builder.endArray(); + builder.endObject(); + return builder; + } + + } + +} diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/template/get/TransportGetComponentTemplateAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/template/get/TransportGetComponentTemplateAction.java new file mode 100644 index 0000000000000..dcb8ea57d01e3 --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/template/get/TransportGetComponentTemplateAction.java @@ -0,0 +1,94 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. 
See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.action.admin.indices.template.get; + +import org.elasticsearch.action.ActionListener; +import org.elasticsearch.action.support.ActionFilters; +import org.elasticsearch.action.support.master.TransportMasterNodeReadAction; +import org.elasticsearch.cluster.ClusterState; +import org.elasticsearch.cluster.block.ClusterBlockException; +import org.elasticsearch.cluster.block.ClusterBlockLevel; +import org.elasticsearch.cluster.metadata.ComponentTemplate; +import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; +import org.elasticsearch.cluster.service.ClusterService; +import org.elasticsearch.common.inject.Inject; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.regex.Regex; +import org.elasticsearch.tasks.Task; +import org.elasticsearch.threadpool.ThreadPool; +import org.elasticsearch.transport.TransportService; + +import java.io.IOException; +import java.util.HashMap; +import java.util.Map; + +public class TransportGetComponentTemplateAction extends + TransportMasterNodeReadAction { + + @Inject + public TransportGetComponentTemplateAction(TransportService transportService, ClusterService clusterService, + ThreadPool threadPool, ActionFilters actionFilters, + IndexNameExpressionResolver indexNameExpressionResolver) { + super(GetComponentTemplateAction.NAME, transportService, clusterService, threadPool, actionFilters, + GetComponentTemplateAction.Request::new, indexNameExpressionResolver); + } + + @Override + protected String executor() { + return ThreadPool.Names.SAME; + } + + @Override + protected GetComponentTemplateAction.Response read(StreamInput in) throws IOException { + return new GetComponentTemplateAction.Response(in); + } + + @Override + protected ClusterBlockException checkBlock(GetComponentTemplateAction.Request request, ClusterState state) { + return state.blocks().globalBlockedException(ClusterBlockLevel.METADATA_READ); + } + + @Override + protected void masterOperation(Task task, GetComponentTemplateAction.Request request, ClusterState state, + ActionListener listener) { + Map allTemplates = state.metaData().componentTemplates(); + + // If we did not ask for a specific name, then we return all templates + if (request.names().length == 0) { + listener.onResponse(new GetComponentTemplateAction.Response(allTemplates)); + return; + } + + final Map results = new HashMap<>(); + for (String name : request.names()) { + if (Regex.isSimpleMatchPattern(name)) { + for (Map.Entry entry : allTemplates.entrySet()) { + if (Regex.simpleMatch(name, entry.getKey())) { + results.put(entry.getKey(), entry.getValue()); + } + } + } else if (allTemplates.containsKey(name)) { + results.put(name, allTemplates.get(name)); + } + } + + listener.onResponse(new GetComponentTemplateAction.Response(results)); + } +} diff --git 
a/server/src/main/java/org/elasticsearch/action/admin/indices/template/put/PutComponentTemplateAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/template/put/PutComponentTemplateAction.java new file mode 100644 index 0000000000000..448be77793e5e --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/template/put/PutComponentTemplateAction.java @@ -0,0 +1,151 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. + */ + +package org.elasticsearch.action.admin.indices.template.put; + +import org.elasticsearch.action.ActionRequestValidationException; +import org.elasticsearch.action.ActionType; +import org.elasticsearch.action.support.master.AcknowledgedResponse; +import org.elasticsearch.action.support.master.MasterNodeRequest; +import org.elasticsearch.cluster.metadata.ComponentTemplate; +import org.elasticsearch.common.Nullable; +import org.elasticsearch.common.Strings; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.io.stream.StreamOutput; + +import java.io.IOException; + +import static org.elasticsearch.action.ValidateActions.addValidationError; + +/** + * An action for putting a single component template into the cluster state + */ +public class PutComponentTemplateAction extends ActionType { + + public static final PutComponentTemplateAction INSTANCE = new PutComponentTemplateAction(); + public static final String NAME = "cluster:admin/component_template/put"; + + private PutComponentTemplateAction() { + super(NAME, AcknowledgedResponse::new); + } + + /** + * A request for putting a single component template into the cluster state + */ + public static class Request extends MasterNodeRequest { + private final String name; + @Nullable + private String cause; + private boolean create; + private ComponentTemplate componentTemplate; + + public Request(StreamInput in) throws IOException { + super(in); + this.name = in.readString(); + this.cause = in.readOptionalString(); + this.create = in.readBoolean(); + this.componentTemplate = new ComponentTemplate(in); + } + + /** + * Constructs a new put component template request with the provided name. 
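+ * + * <p>A request is typically assembled by chaining the setters onto the constructor, for example (sketch; the name and the {@code componentTemplate} instance are hypothetical): + * <pre> + * PutComponentTemplateAction.Request request = new PutComponentTemplateAction.Request("my-component") + *     .create(true) + *     .componentTemplate(componentTemplate); + * </pre>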
+ */ + public Request(String name) { + this.name = name; + } + + @Override + public void writeTo(StreamOutput out) throws IOException { + super.writeTo(out); + out.writeString(name); + out.writeOptionalString(cause); + out.writeBoolean(create); + this.componentTemplate.writeTo(out); + } + + @Override + public ActionRequestValidationException validate() { + ActionRequestValidationException validationException = null; + if (name == null || Strings.hasText(name) == false) { + validationException = addValidationError("name is missing", validationException); + } + if (componentTemplate == null) { + validationException = addValidationError("a component template is required", validationException); + } + return validationException; + } + + /** + * The name of the index template. + */ + public String name() { + return this.name; + } + + /** + * Set to {@code true} to force only creation, not an update of an index template. If it already + * exists, it will fail with an {@link IllegalArgumentException}. + */ + public Request create(boolean create) { + this.create = create; + return this; + } + + public boolean create() { + return create; + } + + /** + * The cause for this index template creation. + */ + public Request cause(@Nullable String cause) { + this.cause = cause; + return this; + } + + @Nullable + public String cause() { + return this.cause; + } + + /** + * The component template that will be inserted into the cluster state + */ + public Request componentTemplate(ComponentTemplate template) { + this.componentTemplate = template; + return this; + } + + public ComponentTemplate componentTemplate() { + return this.componentTemplate; + } + + @Override + public String toString() { + StringBuilder sb = new StringBuilder("PutComponentRequest["); + sb.append("name=").append(name); + sb.append(", cause=").append(cause); + sb.append(", create=").append(create); + sb.append(", component_template=").append(componentTemplate); + sb.append("]"); + return sb.toString(); + } + } + +} diff --git a/server/src/main/java/org/elasticsearch/action/admin/indices/template/put/TransportPutComponentTemplateAction.java b/server/src/main/java/org/elasticsearch/action/admin/indices/template/put/TransportPutComponentTemplateAction.java new file mode 100644 index 0000000000000..bc2816c5614b5 --- /dev/null +++ b/server/src/main/java/org/elasticsearch/action/admin/indices/template/put/TransportPutComponentTemplateAction.java @@ -0,0 +1,94 @@ +/* + * Licensed to Elasticsearch under one or more contributor + * license agreements. See the NOTICE file distributed with + * this work for additional information regarding copyright + * ownership. Elasticsearch licenses this file to you under + * the Apache License, Version 2.0 (the "License"); you may + * not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, + * software distributed under the License is distributed on an + * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + * KIND, either express or implied. See the License for the + * specific language governing permissions and limitations + * under the License. 
+ */ + +package org.elasticsearch.action.admin.indices.template.put; + +import org.elasticsearch.action.ActionListener; +import org.elasticsearch.action.support.ActionFilters; +import org.elasticsearch.action.support.master.AcknowledgedResponse; +import org.elasticsearch.action.support.master.TransportMasterNodeAction; +import org.elasticsearch.cluster.ClusterState; +import org.elasticsearch.cluster.block.ClusterBlockException; +import org.elasticsearch.cluster.block.ClusterBlockLevel; +import org.elasticsearch.cluster.metadata.ComponentTemplate; +import org.elasticsearch.cluster.metadata.IndexMetaData; +import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; +import org.elasticsearch.cluster.metadata.MetaDataIndexTemplateService; +import org.elasticsearch.cluster.metadata.Template; +import org.elasticsearch.cluster.service.ClusterService; +import org.elasticsearch.common.inject.Inject; +import org.elasticsearch.common.io.stream.StreamInput; +import org.elasticsearch.common.settings.IndexScopedSettings; +import org.elasticsearch.common.settings.Settings; +import org.elasticsearch.tasks.Task; +import org.elasticsearch.threadpool.ThreadPool; +import org.elasticsearch.transport.TransportService; + +import java.io.IOException; + +public class TransportPutComponentTemplateAction + extends TransportMasterNodeAction { + + private final MetaDataIndexTemplateService indexTemplateService; + private final IndexScopedSettings indexScopedSettings; + + @Inject + public TransportPutComponentTemplateAction(TransportService transportService, ClusterService clusterService, + ThreadPool threadPool, MetaDataIndexTemplateService indexTemplateService, + ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver, + IndexScopedSettings indexScopedSettings) { + super(PutComponentTemplateAction.NAME, transportService, clusterService, threadPool, actionFilters, + PutComponentTemplateAction.Request::new, indexNameExpressionResolver); + this.indexTemplateService = indexTemplateService; + this.indexScopedSettings = indexScopedSettings; + } + + @Override + protected String executor() { + // we go async right away + return ThreadPool.Names.SAME; + } + + @Override + protected AcknowledgedResponse read(StreamInput in) throws IOException { + return new AcknowledgedResponse(in); + } + + @Override + protected ClusterBlockException checkBlock(PutComponentTemplateAction.Request request, ClusterState state) { + return state.blocks().globalBlockedException(ClusterBlockLevel.METADATA_WRITE); + } + + @Override + protected void masterOperation(Task task, final PutComponentTemplateAction.Request request, final ClusterState state, + final ActionListener listener) { + ComponentTemplate componentTemplate = request.componentTemplate(); + Template template = componentTemplate.template(); + // Normalize the index settings if necessary + if (template.settings() != null) { + Settings.Builder builder = Settings.builder().put(template.settings()).normalizePrefix(IndexMetaData.INDEX_SETTING_PREFIX); + Settings settings = builder.build(); + indexScopedSettings.validate(settings, true); + template = new Template(settings, template.mappings(), template.aliases()); + componentTemplate = new ComponentTemplate(template, componentTemplate.version(), componentTemplate.metadata()); + } + indexTemplateService.putComponentTemplate(request.cause(), request.create(), request.name(), request.masterNodeTimeout(), + componentTemplate, listener); + } +} diff --git 
a/server/src/main/java/org/elasticsearch/action/search/SearchPhaseController.java b/server/src/main/java/org/elasticsearch/action/search/SearchPhaseController.java index f2cf6b199dcbe..9648f74e96b27 100644 --- a/server/src/main/java/org/elasticsearch/action/search/SearchPhaseController.java +++ b/server/src/main/java/org/elasticsearch/action/search/SearchPhaseController.java @@ -19,8 +19,17 @@ package org.elasticsearch.action.search; -import com.carrotsearch.hppc.IntArrayList; -import com.carrotsearch.hppc.ObjectObjectHashMap; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Collection; +import java.util.Collections; +import java.util.HashMap; +import java.util.List; +import java.util.Map; +import java.util.function.Function; +import java.util.function.IntFunction; +import java.util.function.Supplier; +import java.util.stream.Collectors; import org.apache.lucene.index.Term; import org.apache.lucene.search.CollectionStatistics; @@ -58,16 +67,8 @@ import org.elasticsearch.search.suggest.Suggest.Suggestion; import org.elasticsearch.search.suggest.completion.CompletionSuggestion; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.Collection; -import java.util.Collections; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.function.Function; -import java.util.function.IntFunction; -import java.util.stream.Collectors; +import com.carrotsearch.hppc.IntArrayList; +import com.carrotsearch.hppc.ObjectObjectHashMap; public final class SearchPhaseController { private static final ScoreDoc[] EMPTY_DOCS = new ScoreDoc[0]; @@ -429,7 +430,7 @@ public ReducedQueryPhase reducedQueryPhase(Collection queryResults, - List bufferedAggs, List bufferedTopDocs, + List> bufferedAggs, List bufferedTopDocs, TopDocsStats topDocsStats, int numReducePhases, boolean isScrollRequest, InternalAggregation.ReduceContextBuilder aggReduceContextBuilder, boolean performFinalReduce) { @@ -453,7 +454,7 @@ private ReducedQueryPhase reducedQueryPhase(Collection aggregationsList; + final List> aggregationsList; if (bufferedAggs != null) { consumeAggs = false; // we already have results from intermediate reduces and just need to perform the final reduce @@ -492,7 +493,7 @@ private ReducedQueryPhase reducedQueryPhase(Collection> aggregationsList + ) { + /* + * Parse the aggregations, clearing the list as we go so bits backing + * the DelayedWriteable can be collected immediately. + */ + List toReduce = new ArrayList<>(aggregationsList.size()); + for (int i = 0; i < aggregationsList.size(); i++) { + toReduce.add(aggregationsList.get(i).get()); + aggregationsList.set(i, null); + } + return aggregationsList.isEmpty() ? null : InternalAggregations.topLevelReduce(toReduce, + performFinalReduce ? 
aggReduceContextBuilder.forFinalReduction() : aggReduceContextBuilder.forPartialReduction()); + } + /* * Returns the size of the requested top documents (from + size) */ @@ -600,7 +618,7 @@ public InternalSearchResponse buildResponse(SearchHits hits) { */ static final class QueryPhaseResultConsumer extends ArraySearchPhaseResults { private final SearchShardTarget[] processedShards; - private final InternalAggregations[] aggsBuffer; + private final Supplier[] aggsBuffer; private final TopDocs[] topDocsBuffer; private final boolean hasAggs; private final boolean hasTopDocs; @@ -642,7 +660,9 @@ private QueryPhaseResultConsumer(SearchProgressListener progressListener, Search this.progressListener = progressListener; this.processedShards = new SearchShardTarget[expectedResultSize]; // no need to buffer anything if we have less expected results. in this case we don't consume any results ahead of time. - this.aggsBuffer = new InternalAggregations[hasAggs ? bufferSize : 0]; + @SuppressWarnings("unchecked") + Supplier[] aggsBuffer = new Supplier[hasAggs ? bufferSize : 0]; + this.aggsBuffer = aggsBuffer; this.topDocsBuffer = new TopDocs[hasTopDocs ? bufferSize : 0]; this.hasTopDocs = hasTopDocs; this.hasAggs = hasAggs; @@ -665,10 +685,14 @@ private synchronized void consumeInternal(QuerySearchResult querySearchResult) { if (querySearchResult.isNull() == false) { if (index == bufferSize) { if (hasAggs) { - ReduceContext reduceContext = aggReduceContextBuilder.forPartialReduction(); - InternalAggregations reducedAggs = InternalAggregations.topLevelReduce(Arrays.asList(aggsBuffer), reduceContext); - Arrays.fill(aggsBuffer, null); - aggsBuffer[0] = reducedAggs; + List aggs = new ArrayList<>(aggsBuffer.length); + for (int i = 0; i < aggsBuffer.length; i++) { + aggs.add(aggsBuffer[i].get()); + aggsBuffer[i] = null; // null the buffer so it can be GCed now. + } + InternalAggregations reducedAggs = InternalAggregations.topLevelReduce( + aggs, aggReduceContextBuilder.forPartialReduction()); + aggsBuffer[0] = () -> reducedAggs; } if (hasTopDocs) { TopDocs reducedTopDocs = mergeTopDocs(Arrays.asList(topDocsBuffer), @@ -681,12 +705,12 @@ private synchronized void consumeInternal(QuerySearchResult querySearchResult) { index = 1; if (hasAggs || hasTopDocs) { progressListener.notifyPartialReduce(SearchProgressListener.buildSearchShards(processedShards), - topDocsStats.getTotalHits(), hasAggs ? aggsBuffer[0] : null, numReducePhases); + topDocsStats.getTotalHits(), hasAggs ? aggsBuffer[0].get() : null, numReducePhases); } } final int i = index++; if (hasAggs) { - aggsBuffer[i] = (InternalAggregations) querySearchResult.consumeAggs(); + aggsBuffer[i] = querySearchResult.consumeAggs(); } if (hasTopDocs) { final TopDocsAndMaxScore topDocs = querySearchResult.consumeTopDocs(); // can't be null @@ -698,7 +722,7 @@ private synchronized void consumeInternal(QuerySearchResult querySearchResult) { processedShards[querySearchResult.getShardIndex()] = querySearchResult.getSearchShardTarget(); } - private synchronized List getRemainingAggs() { + private synchronized List> getRemainingAggs() { return hasAggs ? 
Arrays.asList(aggsBuffer).subList(0, index) : null; } diff --git a/server/src/main/java/org/elasticsearch/action/search/SearchQueryThenFetchAsyncAction.java b/server/src/main/java/org/elasticsearch/action/search/SearchQueryThenFetchAsyncAction.java index e42d8405da5b0..e8e864ddd1b47 100644 --- a/server/src/main/java/org/elasticsearch/action/search/SearchQueryThenFetchAsyncAction.java +++ b/server/src/main/java/org/elasticsearch/action/search/SearchQueryThenFetchAsyncAction.java @@ -88,7 +88,10 @@ protected void onShardGroupFailure(int shardIndex, SearchShardTarget shardTarget @Override protected void onShardResult(SearchPhaseResult result, SearchShardIterator shardIt) { QuerySearchResult queryResult = result.queryResult(); - if (queryResult.isNull() == false && queryResult.topDocs().topDocs instanceof TopFieldDocs) { + if (queryResult.isNull() == false + // disable sort optims for scroll requests because they keep track of the last bottom doc locally (per shard) + && getRequest().scroll() == null + && queryResult.topDocs().topDocs instanceof TopFieldDocs) { TopFieldDocs topDocs = (TopFieldDocs) queryResult.topDocs().topDocs; if (bottomSortCollector == null) { synchronized (this) { diff --git a/server/src/main/java/org/elasticsearch/action/search/SearchRequest.java b/server/src/main/java/org/elasticsearch/action/search/SearchRequest.java index 761a50bfa4168..1c01da64d45f1 100644 --- a/server/src/main/java/org/elasticsearch/action/search/SearchRequest.java +++ b/server/src/main/java/org/elasticsearch/action/search/SearchRequest.java @@ -89,7 +89,7 @@ public class SearchRequest extends ActionRequest implements IndicesRequest.Repla private int maxConcurrentShardRequests = 0; - private int preFilterShardSize = DEFAULT_PRE_FILTER_SHARD_SIZE; + private Integer preFilterShardSize; private boolean ccsMinimizeRoundtrips = true; @@ -201,7 +201,11 @@ public SearchRequest(StreamInput in) throws IOException { requestCache = in.readOptionalBoolean(); batchedReduceSize = in.readVInt(); maxConcurrentShardRequests = in.readVInt(); - preFilterShardSize = in.readVInt(); + if (in.getVersion().onOrAfter(Version.V_7_7_0)) { + preFilterShardSize = in.readOptionalVInt(); + } else { + preFilterShardSize = in.readVInt(); + } allowPartialSearchResults = in.readOptionalBoolean(); localClusterAlias = in.readOptionalString(); if (localClusterAlias != null) { @@ -231,7 +235,11 @@ public void writeTo(StreamOutput out) throws IOException { out.writeOptionalBoolean(requestCache); out.writeVInt(batchedReduceSize); out.writeVInt(maxConcurrentShardRequests); - out.writeVInt(preFilterShardSize); + if (out.getVersion().onOrAfter(Version.V_7_7_0)) { + out.writeOptionalVInt(preFilterShardSize); + } else { + out.writeVInt(preFilterShardSize == null ? 
DEFAULT_PRE_FILTER_SHARD_SIZE : preFilterShardSize); + } out.writeOptionalBoolean(allowPartialSearchResults); out.writeOptionalString(localClusterAlias); if (localClusterAlias != null) { @@ -269,6 +277,11 @@ public ActionRequestValidationException validate() { addValidationError("[request_cache] cannot be used in a scroll context", validationException); } } + if (source != null) { + if (source.aggregations() != null) { + validationException = source.aggregations().validate(validationException); + } + } return validationException; } @@ -531,8 +544,15 @@ public void setMaxConcurrentShardRequests(int maxConcurrentShardRequests) { /** * Sets a threshold that enforces a pre-filter roundtrip to pre-filter search shards based on query rewriting if the number of shards * the search request expands to exceeds the threshold. This filter roundtrip can limit the number of shards significantly if for - * instance a shard can not match any documents based on it's rewrite method ie. if date filters are mandatory to match but the shard - * bounds and the query are disjoint. The default is {@code 128} + * instance a shard can not match any documents based on its rewrite method ie. if date filters are mandatory to match but the shard + * bounds and the query are disjoint. + * + * When unspecified, the pre-filter phase is executed if any of these conditions is met: + *
<ul> + * <li>The request targets more than 128 shards</li> + * <li>The request targets one or more read-only index</li> + * <li>The primary sort of the query targets an indexed field</li> + * </ul>
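+ * + * <p>For example (an illustrative sketch; {@code searchRequest} stands for any {@code SearchRequest} instance), lowering the threshold to {@code 1} asks the coordinating node to attempt the pre-filter phase for any eligible multi-shard request: + * <pre> + * searchRequest.setPreFilterShardSize(1); + * </pre>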
*/ public void setPreFilterShardSize(int preFilterShardSize) { if (preFilterShardSize < 1) { @@ -543,11 +563,20 @@ public void setPreFilterShardSize(int preFilterShardSize) { /** * Returns a threshold that enforces a pre-filter roundtrip to pre-filter search shards based on query rewriting if the number of shards - * the search request expands to exceeds the threshold. This filter roundtrip can limit the number of shards significantly if for - * instance a shard can not match any documents based on it's rewrite method ie. if date filters are mandatory to match but the shard - * bounds and the query are disjoint. The default is {@code 128} + * the search request expands to exceeds the threshold, or null if the threshold is unspecified. + * This filter roundtrip can limit the number of shards significantly if for + * instance a shard can not match any documents based on its rewrite method ie. if date filters are mandatory to match but the shard + * bounds and the query are disjoint. + * + * When unspecified, the pre-filter phase is executed if any of these conditions is met: + *
<ul> + * <li>The request targets more than 128 shards</li> + * <li>The request targets one or more read-only index</li> + * <li>The primary sort of the query targets an indexed field</li> + * </ul>
*/ - public int getPreFilterShardSize() { + @Nullable + public Integer getPreFilterShardSize() { return preFilterShardSize; } diff --git a/server/src/main/java/org/elasticsearch/action/search/SearchRequestBuilder.java b/server/src/main/java/org/elasticsearch/action/search/SearchRequestBuilder.java index b8b791360d30c..edb2f093b3ee0 100644 --- a/server/src/main/java/org/elasticsearch/action/search/SearchRequestBuilder.java +++ b/server/src/main/java/org/elasticsearch/action/search/SearchRequestBuilder.java @@ -558,8 +558,15 @@ public SearchRequestBuilder setMaxConcurrentShardRequests(int maxConcurrentShard /** * Sets a threshold that enforces a pre-filter roundtrip to pre-filter search shards based on query rewriting if the number of shards * the search request expands to exceeds the threshold. This filter roundtrip can limit the number of shards significantly if for - * instance a shard can not match any documents based on it's rewrite method ie. if date filters are mandatory to match but the shard - * bounds and the query are disjoint. The default is {@code 128} + * instance a shard can not match any documents based on its rewrite method ie. if date filters are mandatory to match but the shard + * bounds and the query are disjoint. + * + * When unspecified, the pre-filter phase is executed if any of these conditions is met: + *
<ul> + * <li>The request targets more than 128 shards</li> + * <li>The request targets one or more read-only index</li> + * <li>The primary sort of the query targets an indexed field</li> + * </ul>
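+ * + * <p>For example (sketch; the {@code client} reference and index pattern are hypothetical): + * <pre> + * client.prepareSearch("logs-*") + *     .setPreFilterShardSize(128) + *     .get(); + * </pre>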
*/ public SearchRequestBuilder setPreFilterShardSize(int preFilterShardSize) { this.request.setPreFilterShardSize(preFilterShardSize); diff --git a/server/src/main/java/org/elasticsearch/action/search/SearchResponse.java b/server/src/main/java/org/elasticsearch/action/search/SearchResponse.java index 81d61f2996ef4..80487710729c9 100644 --- a/server/src/main/java/org/elasticsearch/action/search/SearchResponse.java +++ b/server/src/main/java/org/elasticsearch/action/search/SearchResponse.java @@ -260,7 +260,7 @@ public static SearchResponse fromXContent(XContentParser parser) throws IOExcept return innerFromXContent(parser); } - static SearchResponse innerFromXContent(XContentParser parser) throws IOException { + public static SearchResponse innerFromXContent(XContentParser parser) throws IOException { ensureExpectedToken(Token.FIELD_NAME, parser.currentToken(), parser::getTokenLocation); String currentFieldName = parser.currentName(); SearchHits hits = null; @@ -413,9 +413,7 @@ public Clusters(int total, int successful, int skipped) { } private Clusters(StreamInput in) throws IOException { - this.total = in.readVInt(); - this.successful = in.readVInt(); - this.skipped = in.readVInt(); + this(in.readVInt(), in.readVInt(), in.readVInt()); } @Override @@ -427,7 +425,7 @@ public void writeTo(StreamOutput out) throws IOException { @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { - if (this != EMPTY) { + if (total > 0) { builder.startObject(_CLUSTERS_FIELD.getPreferredName()); builder.field(TOTAL_FIELD.getPreferredName(), total); builder.field(SUCCESSFUL_FIELD.getPreferredName(), successful); diff --git a/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java b/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java index 61bbaa0a5097c..26d7324ebfbf6 100644 --- a/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java +++ b/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java @@ -29,6 +29,7 @@ import org.elasticsearch.action.support.IndicesOptions; import org.elasticsearch.client.Client; import org.elasticsearch.cluster.ClusterState; +import org.elasticsearch.cluster.block.ClusterBlockException; import org.elasticsearch.cluster.block.ClusterBlockLevel; import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver; import org.elasticsearch.cluster.node.DiscoveryNode; @@ -55,7 +56,6 @@ import org.elasticsearch.search.internal.SearchContext; import org.elasticsearch.search.profile.ProfileShardResult; import org.elasticsearch.search.profile.SearchProfileShardResults; -import org.elasticsearch.search.sort.FieldSortBuilder; import org.elasticsearch.tasks.Task; import org.elasticsearch.threadpool.ThreadPool; import org.elasticsearch.transport.RemoteClusterAware; @@ -83,6 +83,7 @@ import static org.elasticsearch.action.search.SearchType.DFS_QUERY_THEN_FETCH; import static org.elasticsearch.action.search.SearchType.QUERY_THEN_FETCH; +import static org.elasticsearch.search.sort.FieldSortBuilder.hasPrimaryFieldSort; public class TransportSearchAction extends HandledTransportAction { @@ -512,7 +513,7 @@ private void executeSearch(SearchTask task, SearchTimeProvider timeProvider, Sea final DiscoveryNodes nodes = clusterState.nodes(); BiFunction connectionLookup = buildConnectionLookup(searchRequest.getLocalClusterAlias(), nodes::get, remoteConnections, searchTransportService::getConnection); - boolean preFilterSearchShards = 
diff --git a/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java b/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java
index 61bbaa0a5097c..26d7324ebfbf6 100644
--- a/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java
+++ b/server/src/main/java/org/elasticsearch/action/search/TransportSearchAction.java
@@ -29,6 +29,7 @@
 import org.elasticsearch.action.support.IndicesOptions;
 import org.elasticsearch.client.Client;
 import org.elasticsearch.cluster.ClusterState;
+import org.elasticsearch.cluster.block.ClusterBlockException;
 import org.elasticsearch.cluster.block.ClusterBlockLevel;
 import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
 import org.elasticsearch.cluster.node.DiscoveryNode;
@@ -55,7 +56,6 @@
 import org.elasticsearch.search.internal.SearchContext;
 import org.elasticsearch.search.profile.ProfileShardResult;
 import org.elasticsearch.search.profile.SearchProfileShardResults;
-import org.elasticsearch.search.sort.FieldSortBuilder;
 import org.elasticsearch.tasks.Task;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.transport.RemoteClusterAware;
@@ -83,6 +83,7 @@

 import static org.elasticsearch.action.search.SearchType.DFS_QUERY_THEN_FETCH;
 import static org.elasticsearch.action.search.SearchType.QUERY_THEN_FETCH;
+import static org.elasticsearch.search.sort.FieldSortBuilder.hasPrimaryFieldSort;

 public class TransportSearchAction extends HandledTransportAction<SearchRequest, SearchResponse> {

@@ -512,7 +513,7 @@ private void executeSearch(SearchTask task, SearchTimeProvider timeProvider, Sea
         final DiscoveryNodes nodes = clusterState.nodes();
         BiFunction<String, String, Transport.Connection> connectionLookup = buildConnectionLookup(searchRequest.getLocalClusterAlias(),
             nodes::get, remoteConnections, searchTransportService::getConnection);
-        boolean preFilterSearchShards = shouldPreFilterSearchShards(searchRequest, shardIterators);
+        boolean preFilterSearchShards = shouldPreFilterSearchShards(clusterState, searchRequest, indices, shardIterators.size());
         searchAsyncAction(task, searchRequest, shardIterators, timeProvider, connectionLookup, clusterState,
             Collections.unmodifiableMap(aliasFilter), concreteIndexBoosts, routingMap, listener, preFilterSearchShards, clusters).start();
     }
@@ -539,12 +540,31 @@ static BiFunction<String, String, Transport.Connection> buildConnectionLookup(St
         };
     }

-    private static boolean shouldPreFilterSearchShards(SearchRequest searchRequest,
-                                                       GroupShardsIterator<SearchShardIterator> shardIterators) {
+    static boolean shouldPreFilterSearchShards(ClusterState clusterState,
+                                               SearchRequest searchRequest,
+                                               Index[] indices,
+                                               int numShards) {
         SearchSourceBuilder source = searchRequest.source();
+        Integer preFilterShardSize = searchRequest.getPreFilterShardSize();
+        if (preFilterShardSize == null
+                && (hasReadOnlyIndices(indices, clusterState) || hasPrimaryFieldSort(source))) {
+            preFilterShardSize = 1;
+        } else if (preFilterShardSize == null) {
+            preFilterShardSize = SearchRequest.DEFAULT_PRE_FILTER_SHARD_SIZE;
+        }
         return searchRequest.searchType() == QUERY_THEN_FETCH // we can't do this for DFS it needs to fan out to all shards all the time
-            && (SearchService.canRewriteToMatchNone(source) || FieldSortBuilder.hasPrimaryFieldSort(source))
-            && searchRequest.getPreFilterShardSize() < shardIterators.size();
+            && (SearchService.canRewriteToMatchNone(source) || hasPrimaryFieldSort(source))
+            && preFilterShardSize < numShards;
+    }
+
+    private static boolean hasReadOnlyIndices(Index[] indices, ClusterState clusterState) {
+        for (Index index : indices) {
+            ClusterBlockException writeBlock = clusterState.blocks().indexBlockedException(ClusterBlockLevel.WRITE, index.getName());
+            if (writeBlock != null) {
+                return true;
+            }
+        }
+        return false;
     }

     static GroupShardsIterator<SearchShardIterator> mergeShardsIterators(GroupShardsIterator<ShardIterator> localShardsIterator,
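A condensed sketch of the rewritten decision in `shouldPreFilterSearchShards`, with the `SearchRequest`/`ClusterState` plumbing replaced by plain booleans and ints (the names here are illustrative, not the actual signatures): an unset `preFilterShardSize` now defaults to 1 when a target index is read-only or the primary sort is on an indexed field, and to the regular default otherwise.

```java
final class PreFilterDecision {
    static final int DEFAULT_PRE_FILTER_SHARD_SIZE = 128;

    static boolean shouldPreFilter(Integer requestedThreshold,   // null when the user did not set one
                                   boolean isQueryThenFetch,     // DFS must always fan out to all shards
                                   boolean canRewriteToMatchNone,
                                   boolean hasPrimaryFieldSort,
                                   boolean hasReadOnlyIndices,
                                   int numShards) {
        final int threshold;
        if (requestedThreshold != null) {
            threshold = requestedThreshold;                      // an explicit value always wins
        } else if (hasReadOnlyIndices || hasPrimaryFieldSort) {
            threshold = 1;                                       // pre-filter even small searches
        } else {
            threshold = DEFAULT_PRE_FILTER_SHARD_SIZE;
        }
        return isQueryThenFetch
            && (canRewriteToMatchNone || hasPrimaryFieldSort)
            && threshold < numShards;
    }
}
```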
diff --git a/server/src/main/java/org/elasticsearch/client/IndicesAdminClient.java b/server/src/main/java/org/elasticsearch/client/IndicesAdminClient.java
index 36b34a7b24c85..13f28dbbce210 100644
--- a/server/src/main/java/org/elasticsearch/client/IndicesAdminClient.java
+++ b/server/src/main/java/org/elasticsearch/client/IndicesAdminClient.java
@@ -21,6 +21,9 @@

 import org.elasticsearch.action.ActionFuture;
 import org.elasticsearch.action.ActionListener;
+import org.elasticsearch.action.admin.indices.datastream.DeleteDataStreamAction;
+import org.elasticsearch.action.admin.indices.datastream.GetDataStreamsAction;
+import org.elasticsearch.action.admin.indices.datastream.CreateDataStreamAction;
 import org.elasticsearch.action.admin.indices.alias.IndicesAliasesRequest;
 import org.elasticsearch.action.admin.indices.alias.IndicesAliasesRequestBuilder;
 import org.elasticsearch.action.admin.indices.alias.get.GetAliasesRequest;
@@ -713,4 +716,33 @@ public interface IndicesAdminClient extends ElasticsearchClient {
      */
     void rolloverIndex(RolloverRequest request, ActionListener<RolloverResponse> listener);

+    /**
+     * Store a data stream
+     */
+    void createDataStream(CreateDataStreamAction.Request request, ActionListener<AcknowledgedResponse> listener);
+
+    /**
+     * Store a data stream
+     */
+    ActionFuture<AcknowledgedResponse> createDataStream(CreateDataStreamAction.Request request);
+
+    /**
+     * Delete a data stream
+     */
+    void deleteDataStream(DeleteDataStreamAction.Request request, ActionListener<AcknowledgedResponse> listener);
+
+    /**
+     * Delete a data stream
+     */
+    ActionFuture<AcknowledgedResponse> deleteDataStream(DeleteDataStreamAction.Request request);
+
+    /**
+     * Get data streams
+     */
+    void getDataStreams(GetDataStreamsAction.Request request, ActionListener<GetDataStreamsAction.Response> listener);
+
+    /**
+     * Get data streams
+     */
+    ActionFuture<GetDataStreamsAction.Response> getDataStreams(GetDataStreamsAction.Request request);
 }
diff --git a/server/src/main/java/org/elasticsearch/client/support/AbstractClient.java b/server/src/main/java/org/elasticsearch/client/support/AbstractClient.java
index 1ee480fb55edd..669df02de212b 100644
--- a/server/src/main/java/org/elasticsearch/client/support/AbstractClient.java
+++ b/server/src/main/java/org/elasticsearch/client/support/AbstractClient.java
@@ -30,6 +30,9 @@
 import org.elasticsearch.action.admin.cluster.allocation.ClusterAllocationExplainRequest;
 import org.elasticsearch.action.admin.cluster.allocation.ClusterAllocationExplainRequestBuilder;
 import org.elasticsearch.action.admin.cluster.allocation.ClusterAllocationExplainResponse;
+import org.elasticsearch.action.admin.indices.datastream.DeleteDataStreamAction;
+import org.elasticsearch.action.admin.indices.datastream.GetDataStreamsAction;
+import org.elasticsearch.action.admin.indices.datastream.CreateDataStreamAction;
 import org.elasticsearch.action.admin.cluster.health.ClusterHealthAction;
 import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequest;
 import org.elasticsearch.action.admin.cluster.health.ClusterHealthRequestBuilder;
@@ -1657,6 +1660,36 @@ public ActionFuture<GetSettingsResponse> getSettings(GetSettingsRequest request)
         public void getSettings(GetSettingsRequest request, ActionListener<GetSettingsResponse> listener) {
             execute(GetSettingsAction.INSTANCE, request, listener);
         }
+
+        @Override
+        public void createDataStream(CreateDataStreamAction.Request request, ActionListener<AcknowledgedResponse> listener) {
+            execute(CreateDataStreamAction.INSTANCE, request, listener);
+        }
+
+        @Override
+        public ActionFuture<AcknowledgedResponse> createDataStream(CreateDataStreamAction.Request request) {
+            return execute(CreateDataStreamAction.INSTANCE, request);
+        }
+
+        @Override
+        public void deleteDataStream(DeleteDataStreamAction.Request request, ActionListener<AcknowledgedResponse> listener) {
+            execute(DeleteDataStreamAction.INSTANCE, request, listener);
+        }
+
+        @Override
+        public ActionFuture<AcknowledgedResponse> deleteDataStream(DeleteDataStreamAction.Request request) {
+            return execute(DeleteDataStreamAction.INSTANCE, request);
+        }
+
+        @Override
+        public void getDataStreams(GetDataStreamsAction.Request request, ActionListener<GetDataStreamsAction.Response> listener) {
+            execute(GetDataStreamsAction.INSTANCE, request, listener);
+        }
+
+        @Override
+        public ActionFuture<GetDataStreamsAction.Response> getDataStreams(GetDataStreamsAction.Request request) {
+            return execute(GetDataStreamsAction.INSTANCE, request);
+        }
     }

     @Override
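A hypothetical usage sketch for the client methods added above. The one-argument `Request` constructor is an assumption about parts of the change not shown in this diff; the listener shape follows the `ActionListener` interface.

```java
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.admin.indices.datastream.CreateDataStreamAction;
import org.elasticsearch.action.support.master.AcknowledgedResponse;
import org.elasticsearch.client.Client;

public class DataStreamClientExample {
    // Creates a data stream through the new admin-client method. The
    // single-argument Request constructor is assumed, not shown in this diff.
    static void createLogsStream(Client client) {
        CreateDataStreamAction.Request request = new CreateDataStreamAction.Request("logs-foo");
        client.admin().indices().createDataStream(request, new ActionListener<AcknowledgedResponse>() {
            @Override
            public void onResponse(AcknowledgedResponse response) {
                // acknowledged == true once the stream is in the cluster state
            }

            @Override
            public void onFailure(Exception e) {
                // surface or retry
            }
        });
    }
}
```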
diff --git a/server/src/main/java/org/elasticsearch/cluster/ClusterModule.java b/server/src/main/java/org/elasticsearch/cluster/ClusterModule.java
index 70138ecc981bc..6dfe07e5c20aa 100644
--- a/server/src/main/java/org/elasticsearch/cluster/ClusterModule.java
+++ b/server/src/main/java/org/elasticsearch/cluster/ClusterModule.java
@@ -23,8 +23,10 @@
 import org.elasticsearch.cluster.action.index.NodeMappingRefreshAction;
 import org.elasticsearch.cluster.action.shard.ShardStateAction;
 import org.elasticsearch.cluster.metadata.ComponentTemplateMetadata;
+import org.elasticsearch.cluster.metadata.DataStreamMetadata;
 import org.elasticsearch.cluster.metadata.IndexGraveyard;
 import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
+import org.elasticsearch.cluster.metadata.IndexTemplateV2Metadata;
 import org.elasticsearch.cluster.metadata.MetaData;
 import org.elasticsearch.cluster.metadata.MetaDataDeleteIndexService;
 import org.elasticsearch.cluster.metadata.MetaDataIndexAliasesService;
@@ -130,6 +132,9 @@ public static List<Entry> getNamedWriteables() {
             PersistentTasksCustomMetaData::readDiffFrom);
         registerMetaDataCustom(entries, ComponentTemplateMetadata.TYPE, ComponentTemplateMetadata::new,
             ComponentTemplateMetadata::readDiffFrom);
+        registerMetaDataCustom(entries, IndexTemplateV2Metadata.TYPE, IndexTemplateV2Metadata::new,
+            IndexTemplateV2Metadata::readDiffFrom);
+        registerMetaDataCustom(entries, DataStreamMetadata.TYPE, DataStreamMetadata::new, DataStreamMetadata::readDiffFrom);
         // Task Status (not Diffable)
         entries.add(new Entry(Task.Status.class, PersistentTasksNodeService.Status.NAME, PersistentTasksNodeService.Status::new));
         return entries;
@@ -150,6 +155,10 @@ public static List<NamedXContentRegistry.Entry> getNamedXWriteables() {
             PersistentTasksCustomMetaData::fromXContent));
         entries.add(new NamedXContentRegistry.Entry(MetaData.Custom.class, new ParseField(ComponentTemplateMetadata.TYPE),
             ComponentTemplateMetadata::fromXContent));
+        entries.add(new NamedXContentRegistry.Entry(MetaData.Custom.class, new ParseField(IndexTemplateV2Metadata.TYPE),
+            IndexTemplateV2Metadata::fromXContent));
+        entries.add(new NamedXContentRegistry.Entry(MetaData.Custom.class, new ParseField(DataStreamMetadata.TYPE),
+            DataStreamMetadata::fromXContent));
         return entries;
     }
diff --git a/server/src/main/java/org/elasticsearch/cluster/coordination/ClusterFormationFailureHelper.java b/server/src/main/java/org/elasticsearch/cluster/coordination/ClusterFormationFailureHelper.java
index c249458dd23cf..e68af52eaed38 100644
--- a/server/src/main/java/org/elasticsearch/cluster/coordination/ClusterFormationFailureHelper.java
+++ b/server/src/main/java/org/elasticsearch/cluster/coordination/ClusterFormationFailureHelper.java
@@ -30,6 +30,7 @@
 import org.elasticsearch.common.transport.TransportAddress;
 import org.elasticsearch.common.unit.TimeValue;
 import org.elasticsearch.common.util.concurrent.AbstractRunnable;
+import org.elasticsearch.gateway.GatewayMetaState;
 import org.elasticsearch.threadpool.ThreadPool;
 import org.elasticsearch.threadpool.ThreadPool.Names;

@@ -206,7 +207,12 @@ private String describeQuorum(VotingConfiguration votingConfiguration) {
         assert requiredNodes <= realNodeIds.size() : nodeIds;

         if (nodeIds.size() == 1) {
-            return "a node with id " + realNodeIds;
+            if (nodeIds.contains(GatewayMetaState.STALE_STATE_CONFIG_NODE_ID)) {
+                return "one or more nodes that have already participated as master-eligible nodes in the cluster but this node was "
+                    + "not master-eligible the last time it joined the cluster";
+            } else {
+                return "a node with id " + realNodeIds;
+            }
         } else if (nodeIds.size() == 2) {
             return "two nodes with ids " + realNodeIds;
         } else {
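The `ClusterModule` hunks above follow a dual-registration pattern: each custom metadata section needs a named-writeable entry for the binary cluster state and a named-XContent entry for the rendered form, both keyed by the same `TYPE` string. A generic, hypothetical sketch of that idea (not the Elasticsearch registries):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// One reader per channel, both filed under the same type key, so a section
// registered for the binary format is also parseable from its text form.
final class SectionRegistry<T> {
    private final Map<String, Function<byte[], T>> binaryReaders = new HashMap<>();
    private final Map<String, Function<String, T>> xContentParsers = new HashMap<>();

    void register(String type, Function<byte[], T> binaryReader, Function<String, T> xContentParser) {
        binaryReaders.put(type, binaryReader);
        xContentParsers.put(type, xContentParser);
    }

    T readBinary(String type, byte[] bytes) {
        return binaryReaders.get(type).apply(bytes);
    }

    T parse(String type, String json) {
        return xContentParsers.get(type).apply(json);
    }
}
```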
diff --git a/server/src/main/java/org/elasticsearch/cluster/coordination/PublicationTransportHandler.java b/server/src/main/java/org/elasticsearch/cluster/coordination/PublicationTransportHandler.java
index a4b2f509b2ed8..f5c325d3915d4 100644
--- a/server/src/main/java/org/elasticsearch/cluster/coordination/PublicationTransportHandler.java
+++ b/server/src/main/java/org/elasticsearch/cluster/coordination/PublicationTransportHandler.java
@@ -161,6 +161,7 @@ public PublicationContext newPublicationContext(ClusterChangedEvent clusterChang
         public void sendPublishRequest(DiscoveryNode destination, PublishRequest publishRequest,
                                        ActionListener<PublishWithJoinResponse> originalListener) {
             assert publishRequest.getAcceptedState() == clusterChangedEvent.state() : "state got switched on us";
+            assert transportService.getThreadPool().getThreadContext().isSystemContext();
             final ActionListener<PublishWithJoinResponse> responseActionListener;
             if (destination.equals(nodes.getLocalNode())) {
                 // if publishing to self, use original request instead (see currentPublishRequestToSelf for explanation)
@@ -197,6 +198,7 @@ public void onFailure(Exception e) {
         @Override
         public void sendApplyCommit(DiscoveryNode destination, ApplyCommitRequest applyCommitRequest,
                                     ActionListener<TransportResponse.Empty> responseActionListener) {
+            assert transportService.getThreadPool().getThreadContext().isSystemContext();
             transportService.sendRequest(destination, COMMIT_STATE_ACTION_NAME, applyCommitRequest, stateRequestOptions,
                 new TransportResponseHandler<TransportResponse.Empty>() {
@@ -319,8 +321,8 @@ private void sendClusterStateDiff(ClusterState clusterState,

     public static BytesReference serializeFullClusterState(ClusterState clusterState, Version nodeVersion) throws IOException {
         final BytesStreamOutput bStream = new BytesStreamOutput();
-        bStream.setVersion(nodeVersion);
         try (StreamOutput stream = CompressorFactory.COMPRESSOR.streamOutput(bStream)) {
+            stream.setVersion(nodeVersion);
             stream.writeBoolean(true);
             clusterState.writeTo(stream);
         }
@@ -329,8 +331,8 @@ public static BytesReference serializeFullClusterState(ClusterState clusterState,

     public static BytesReference serializeDiffClusterState(Diff diff, Version nodeVersion) throws IOException {
         final BytesStreamOutput bStream = new BytesStreamOutput();
-        bStream.setVersion(nodeVersion);
         try (StreamOutput stream = CompressorFactory.COMPRESSOR.streamOutput(bStream)) {
+            stream.setVersion(nodeVersion);
             stream.writeBoolean(false);
             diff.writeTo(stream);
         }
@@ -340,12 +342,12 @@ public static BytesReference serializeDiffClusterState(Diff diff, Version nodeVe

     private PublishWithJoinResponse handleIncomingPublishRequest(BytesTransportRequest request) throws IOException {
         final Compressor compressor = CompressorFactory.compressor(request.bytes());
         StreamInput in = request.bytes().streamInput();
-        in.setVersion(request.version());
         try {
             if (compressor != null) {
                 in = compressor.streamInput(in);
             }
             in = new NamedWriteableAwareStreamInput(in, namedWriteableRegistry);
+            in.setVersion(request.version());
             // If true we received full cluster state - otherwise diffs
             if (in.readBoolean()) {
                 final ClusterState incomingState;
diff --git a/server/src/main/java/org/elasticsearch/cluster/health/ClusterHealthStatus.java b/server/src/main/java/org/elasticsearch/cluster/health/ClusterHealthStatus.java
index 8a255fb1ce816..c0738ee82226a 100644
--- a/server/src/main/java/org/elasticsearch/cluster/health/ClusterHealthStatus.java
+++ b/server/src/main/java/org/elasticsearch/cluster/health/ClusterHealthStatus.java
@@ -52,10 +52,7 @@ public void writeTo(StreamOutput out) throws IOException {
      * @throws IllegalArgumentException if the value is unrecognized
      */
     public static ClusterHealthStatus readFrom(StreamInput in) throws IOException {
-        return fromValue(in.readByte());
-    }
-
-    public static ClusterHealthStatus fromValue(byte value) throws IOException {
+        byte value = in.readByte();
         switch (value) {
             case 0:
                 return GREEN;
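The serialization hunks above move `setVersion` from the underlying byte buffer to the wrapping stream (compressed on the write path, named-writeable-aware on the read path) that the cluster state actually flows through. A self-contained sketch of the bug class being fixed, with a hypothetical `VersionedStream` standing in for the real stream types:

```java
// A wrapper stream starts back at the current version and does not inherit
// the version that was set on the stream it wraps.
final class VersionedStream {
    static final int CURRENT_VERSION = 8;
    int version = CURRENT_VERSION;

    // Stands in for wrapping with a compressor or NamedWriteable decorator.
    static VersionedStream wrap(VersionedStream inner) {
        return new VersionedStream(); // fresh wrapper, version reset to CURRENT_VERSION
    }
}

final class SerializationOrderDemo {
    public static void main(String[] args) {
        int nodeVersion = 7; // pretend the destination node is one version behind

        // Before the fix: version set on the buffer, then lost by wrapping.
        VersionedStream buffer = new VersionedStream();
        buffer.version = nodeVersion;
        VersionedStream broken = VersionedStream.wrap(buffer);
        assert broken.version == VersionedStream.CURRENT_VERSION; // stale default

        // After the fix: wrap first, then set the version on the stream that
        // the cluster state is actually serialized into.
        VersionedStream fixed = VersionedStream.wrap(new VersionedStream());
        fixed.version = nodeVersion;
        assert fixed.version == nodeVersion;
    }
}
```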
diff --git a/server/src/main/java/org/elasticsearch/cluster/health/ClusterIndexHealth.java b/server/src/main/java/org/elasticsearch/cluster/health/ClusterIndexHealth.java
index c1a52f2ffc548..1ee111b1f0e1a 100644
--- a/server/src/main/java/org/elasticsearch/cluster/health/ClusterIndexHealth.java
+++ b/server/src/main/java/org/elasticsearch/cluster/health/ClusterIndexHealth.java
@@ -165,7 +165,7 @@ public ClusterIndexHealth(final StreamInput in) throws IOException {
         relocatingShards = in.readVInt();
         initializingShards = in.readVInt();
         unassignedShards = in.readVInt();
-        status = ClusterHealthStatus.fromValue(in.readByte());
+        status = ClusterHealthStatus.readFrom(in);

         int size = in.readVInt();
         shards = new HashMap<>(size);
diff --git a/server/src/main/java/org/elasticsearch/cluster/health/ClusterShardHealth.java b/server/src/main/java/org/elasticsearch/cluster/health/ClusterShardHealth.java
index 1d3a3dcee7b95..a96aedb023a4e 100644
--- a/server/src/main/java/org/elasticsearch/cluster/health/ClusterShardHealth.java
+++ b/server/src/main/java/org/elasticsearch/cluster/health/ClusterShardHealth.java
@@ -121,7 +121,7 @@ public ClusterShardHealth(final int shardId, final IndexShardRoutingTable shardR

     public ClusterShardHealth(final StreamInput in) throws IOException {
         shardId = in.readVInt();
-        status = ClusterHealthStatus.fromValue(in.readByte());
+        status = ClusterHealthStatus.readFrom(in);
         activeShards = in.readVInt();
         relocatingShards = in.readVInt();
         initializingShards = in.readVInt();
diff --git a/server/src/main/java/org/elasticsearch/cluster/health/ClusterStateHealth.java b/server/src/main/java/org/elasticsearch/cluster/health/ClusterStateHealth.java
index ad1561e4f0eda..dbaefb359e073 100644
--- a/server/src/main/java/org/elasticsearch/cluster/health/ClusterStateHealth.java
+++ b/server/src/main/java/org/elasticsearch/cluster/health/ClusterStateHealth.java
@@ -134,7 +134,7 @@ public ClusterStateHealth(final StreamInput in) throws IOException {
         unassignedShards = in.readVInt();
         numberOfNodes = in.readVInt();
         numberOfDataNodes = in.readVInt();
-        status = ClusterHealthStatus.fromValue(in.readByte());
+        status = ClusterHealthStatus.readFrom(in);
         int size = in.readVInt();
         indices = new HashMap<>(size);
         for (int i = 0; i < size; i++) {
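The health-class hunks all switch from `ClusterHealthStatus.fromValue(in.readByte())` to `ClusterHealthStatus.readFrom(in)`, so the one-byte wire encoding is owned by the enum rather than repeated at every call site. A minimal stand-alone sketch of that shape, with a hypothetical `Status` and `DataInputStream` instead of `StreamInput`:

```java
import java.io.DataInputStream;
import java.io.IOException;

// The enum owns its one-byte wire format; callers never touch the raw byte.
enum Status {
    GREEN, YELLOW, RED;

    static Status readFrom(DataInputStream in) throws IOException {
        byte value = in.readByte();
        switch (value) {
            case 0: return GREEN;
            case 1: return YELLOW;
            case 2: return RED;
            default: throw new IllegalArgumentException("no status for value " + value);
        }
    }
}
// Call sites shrink from Status.fromValue(in.readByte()) to Status.readFrom(in).
```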
diff --git a/server/src/main/java/org/elasticsearch/cluster/metadata/ComponentTemplate.java b/server/src/main/java/org/elasticsearch/cluster/metadata/ComponentTemplate.java
index 2b160b1b2fb11..0cc8c8f8a940d 100644
--- a/server/src/main/java/org/elasticsearch/cluster/metadata/ComponentTemplate.java
+++ b/server/src/main/java/org/elasticsearch/cluster/metadata/ComponentTemplate.java
@@ -24,26 +24,19 @@
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.ParseField;
 import org.elasticsearch.common.Strings;
-import org.elasticsearch.common.bytes.BytesArray;
-import org.elasticsearch.common.compress.CompressedXContent;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
-import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.xcontent.ConstructingObjectParser;
 import org.elasticsearch.common.xcontent.ToXContentObject;
 import org.elasticsearch.common.xcontent.XContentBuilder;
-import org.elasticsearch.common.xcontent.XContentFactory;
-import org.elasticsearch.common.xcontent.XContentHelper;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.common.xcontent.XContentType;

 import java.io.IOException;
-import java.util.HashMap;
 import java.util.Map;
 import java.util.Objects;

 /**
- * A component template is a re-usable template as well as metadata about the template. Each
+ * A component template is a re-usable {@link Template} as well as metadata about the template. Each
  * component template is expected to be valid on its own. For example, if a component template
  * contains a field "foo", it's expected to contain all the necessary settings/mappings/etc for the
  * "foo" field. These component templates make up the individual pieces composing an index template.
@@ -157,144 +150,4 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws
         builder.endObject();
         return builder;
     }
-
-    static class Template extends AbstractDiffable<Template>
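The removed hunk drops the inner `Template` class while the Javadoc now links to a shared `Template` type, which suggests the class was extracted so other template kinds can reuse it. A heavily simplified, hypothetical sketch of the resulting shape (not the actual Elasticsearch classes):

```java
import java.util.Map;

// A shared Template payload that ComponentTemplate references instead of
// declaring as an inner class.
final class Template {
    final Map<String, String> settings;
    final String mappingsJson;

    Template(Map<String, String> settings, String mappingsJson) {
        this.settings = settings;
        this.mappingsJson = mappingsJson;
    }
}

final class ComponentTemplateSketch {
    final Template template;            // the re-usable part
    final Long version;                 // optional user-managed version
    final Map<String, Object> metadata; // arbitrary user metadata

    ComponentTemplateSketch(Template template, Long version, Map<String, Object> metadata) {
        this.template = template;
        this.version = version;
        this.metadata = metadata;
    }
}
```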