Merged
39 commits
f8c7e67
HDFS-16628 RBF: Correct target directory when move to trash for kerbe…
zhangxiping1 Jun 15, 2022
1b25851
HADOOP-18159. Bump cos_api-bundle to 5.6.69 to update public-suffix-l…
andreAmorimF Jun 15, 2022
6cbeae2
HDFS-16581.Print node status when executing printTopology. (#4321)
jianghuazhu Jun 16, 2022
9e3fc40
HDFS-16613. EC: Improve performance of decommissioning dn with many e…
lfxy Jun 16, 2022
7bfff63
HADOOP-18289. Remove WhiteBox in hadoop-kms module. (#4433)
slfan1989 Jun 17, 2022
4893f00
HDFS-16600. Fix deadlock of fine-grain lock for FsDatastImpl of DataN…
ZanderXu Jun 17, 2022
020201c
Queue filter in CS UI v1 does not work as expected. Contributed by Ch…
brumi1024 Jun 17, 2022
e199da3
HADOOP-17833. Improve Magic Committer performance (#3289)
steveloughran Jun 17, 2022
80446dc
YARN-11172. Fix TestClientRMTokens#testDelegationToken introduced by …
zhengchenyu Jun 17, 2022
62e4476
YARN-10122. Support signalToContainer API for Federation. (#4421)
slfan1989 Jun 17, 2022
e38e13b
HADOOP-18288. Total requests and total requests per sec served by RPC…
virajjasani Jun 18, 2022
cb04210
HDFS-16634. Dynamically adjust slow peer report size on JMX metrics (…
virajjasani Jun 20, 2022
cfceaeb
HDFS-16064. Determine when to invalidate corrupt replicas based on nu…
KevinWikant Jun 20, 2022
4f425b6
YARN-9827.Fix Http Response code in GenericExceptionHandler (#4393)
hotcodemacha Jun 20, 2022
a77d522
HADOOP-18255. Fix fsdatainputstreambuilder.md reference to hadoop bra…
hotcodemacha Jun 20, 2022
efc2761
HDFS-16635.Fixed javadoc error in Java 11 (#4451)
hotcodemacha Jun 20, 2022
477b67a
HADOOP-18266. Using HashSet/ TreeSet Constructor for hadoop-common (#…
Samrat002 Jun 20, 2022
10fc865
MAPREDUCE-7387. Fix TestJHSSecurity#testDelegationToken AssertionErro…
slfan1989 Jun 20, 2022
36c4be8
MAPREDUCE-7369. Fixed MapReduce tasks timing out when spends more tim…
hotcodemacha Jun 20, 2022
5d08ffa
YARN-11182. Refactor TestAggregatedLogDeletionService: 2nd phase. Con…
9uapaw Jun 20, 2022
3a66348
YARN-11185. Pending app metrics are increased doubly when a queue rea…
szilard-nemeth Jun 20, 2022
7a1d811
HDFS-16637. TestHDFSCLI#testAll consistently failing (#4466). Contrib…
virajjasani Jun 21, 2022
ef36457
MAPREDUCE-7389. Fix typo in description of property (#4440). Contribu…
usev6 Jun 21, 2022
cbdabe9
YARN-9971.YARN Native Service HttpProbe logs THIS_HOST in error messa…
hotcodemacha Jun 22, 2022
e8fd914
HDFS-16616. remove use of org.apache.hadoop.util.Sets (#4400)
Samrat002 Jun 22, 2022
c9ddbd2
MAPREDUCE-7391. TestLocalDistributedCacheManager failing after HADOOP…
steveloughran Jun 22, 2022
e6ecc4f
YARN-11188. Only files belong to the first file controller are remove…
9uapaw Jun 22, 2022
2daf0a8
HADOOP-11867. Add a high-performance vectored read API. (#3904)
mukund-thakur Feb 1, 2022
5db0f34
HADOOP-18104: S3A: Add configs to configure minSeekForVectorReads and…
mukund-thakur Apr 29, 2022
1408dd8
HADOOP-18107 Adding scale test for vectored reads for large file (#4273)
mukund-thakur Jun 1, 2022
0d49bd2
HADOOP-18105 Implement buffer pooling with weak references (#4263)
mukund-thakur Jun 1, 2022
4d1f6f9
HADOOP-18106: Handle memory fragmentation in S3A Vectored IO. (#4445)
mukund-thakur Jun 20, 2022
e1842b2
HADOOP-18103. Add a high-performance vectored read API. (#4476)
steveloughran Jun 22, 2022
77d1b19
HADOOP-18300. Upgrade Gson dependency to version 2.9.0 (#4454)
medb Jun 22, 2022
dd819f7
HADOOP-18271.Remove unused Imports in Hadoop Common project (#4392)
hotcodemacha Jun 23, 2022
0af4bb3
YARN-11192. TestRouterWebServicesREST failing after YARN-9827. (#4484…
slfan1989 Jun 23, 2022
4abb2ba
YARN-10320.Replace FSDataInputStream#read with readFully in Log Aggre…
hotcodemacha Jun 23, 2022
734b6f1
YARN-9874.Remove unnecessary LevelDb write call in LeveldbConfigurati…
hotcodemacha Jun 23, 2022
b7edc6c
HDFS-16633. Fixing when Reserved Space For Replicas is not released o…
hotcodemacha Jun 24, 2022
2 changes: 1 addition & 1 deletion LICENSE-binary
@@ -231,7 +231,7 @@ com.github.stephenc.jcip:jcip-annotations:1.0-1
com.google:guice:4.0
com.google:guice-servlet:4.0
com.google.api.grpc:proto-google-common-protos:1.0.0
-com.google.code.gson:2.2.4
+com.google.code.gson:2.9.0
com.google.errorprone:error_prone_annotations:2.2.0
com.google.j2objc:j2objc-annotations:1.1
com.google.json-simple:json-simple:1.1.1
2 changes: 1 addition & 1 deletion dev-support/Jenkinsfile
@@ -47,7 +47,7 @@ pipeline {

options {
buildDiscarder(logRotator(numToKeepStr: '5'))
-timeout (time: 24, unit: 'HOURS')
+timeout (time: 48, unit: 'HOURS')
timestamps()
checkoutToSubdirectory('src')
}
6 changes: 6 additions & 0 deletions hadoop-client-modules/hadoop-client-minicluster/pom.xml
@@ -757,6 +757,12 @@
<exclude>META-INF/versions/11/module-info.class</exclude>
</excludes>
</filter>
+<filter>
+<artifact>com.google.code.gson:gson</artifact>
+<excludes>
+<exclude>META-INF/versions/9/module-info.class</exclude>
+</excludes>
+</filter>

<!-- Mockito tries to include its own unrelocated copy of hamcrest. :( -->
<filter>
7 changes: 7 additions & 0 deletions hadoop-client-modules/hadoop-client-runtime/pom.xml
@@ -249,6 +249,13 @@
<exclude>META-INF/versions/11/module-info.class</exclude>
</excludes>
</filter>
+<filter>
+<artifact>com.google.code.gson:gson</artifact>
+<excludes>
+<exclude>META-INF/versions/9/module-info.class</exclude>
+</excludes>
+</filter>
+
</filters>
<relocations>
<relocation>
2 changes: 1 addition & 1 deletion hadoop-cloud-storage-project/hadoop-cos/pom.xml
@@ -109,7 +109,7 @@
<dependency>
<groupId>com.qcloud</groupId>
<artifactId>cos_api-bundle</artifactId>
-<version>5.6.19</version>
+<version>5.6.69</version>
<scope>compile</scope>
</dependency>

@@ -18,7 +18,6 @@
package org.apache.hadoop.security.authentication.util;

import java.io.ByteArrayInputStream;
-import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;
import java.security.PublicKey;
import java.security.cert.CertificateException;
@@ -236,7 +236,7 @@ public static final String getServicePrincipal(String service,
*/
static final String[] getPrincipalNames(String keytabFileName) throws IOException {
Keytab keytab = Keytab.loadKeytab(new File(keytabFileName));
-Set<String> principals = new HashSet<String>();
+Set<String> principals = new HashSet<>();
List<PrincipalName> entries = keytab.getPrincipals();
for (PrincipalName entry : entries) {
principals.add(entry.getName().replace("\\", "/"));
@@ -108,9 +108,9 @@ public AppConfigurationEntry[] getAppConfigurationEntry(String name) {
public static <T> T doAs(String principal, final Callable<T> callable) throws Exception {
LoginContext loginContext = null;
try {
-Set<Principal> principals = new HashSet<Principal>();
+Set<Principal> principals = new HashSet<>();
principals.add(new KerberosPrincipal(KerberosTestUtils.getClientPrincipal()));
-Subject subject = new Subject(false, principals, new HashSet<Object>(), new HashSet<Object>());
+Subject subject = new Subject(false, principals, new HashSet<>(), new HashSet<>());
loginContext = new LoginContext("", subject, null, new KerberosConfiguration(principal));
loginContext.login();
subject = loginContext.getSubject();
@@ -25,7 +25,6 @@
import java.util.List;
import java.util.ArrayList;
import java.util.Properties;
-import java.util.Vector;
import java.util.Date;

import javax.servlet.ServletException;
@@ -13,7 +13,6 @@
*/
package org.apache.hadoop.security.authentication.server;

-import org.apache.hadoop.security.authentication.client.AuthenticationException;
import org.apache.hadoop.security.authentication.client.PseudoAuthenticator;
import org.junit.Assert;
import org.junit.Test;
@@ -18,7 +18,6 @@
import javax.servlet.ServletContext;

import org.apache.hadoop.classification.VisibleForTesting;
-import org.apache.hadoop.classification.InterfaceAudience;
import org.apache.hadoop.classification.InterfaceStability;
import org.apache.hadoop.security.authentication.server.AuthenticationFilter;

@@ -774,7 +774,7 @@ private void updatePropertiesWithDeprecatedKeys(
private void handleDeprecation() {
LOG.debug("Handling deprecation for all properties in config...");
DeprecationContext deprecations = deprecationContext.get();
-Set<Object> keys = new HashSet<Object>();
+Set<Object> keys = new HashSet<>();
keys.addAll(getProps().keySet());
for (Object item: keys) {
LOG.debug("Handling deprecation for " + (String)item);
@@ -25,10 +25,6 @@
import java.util.List;
import java.util.ListIterator;

-import javax.crypto.Cipher;
-import javax.crypto.spec.IvParameterSpec;
-import javax.crypto.spec.SecretKeySpec;
-
import org.apache.hadoop.util.Preconditions;
import org.apache.hadoop.classification.InterfaceAudience;
import org.apache.hadoop.crypto.CryptoCodec;
@@ -31,7 +31,6 @@
import java.util.concurrent.atomic.AtomicInteger;

import javax.net.ssl.SSLException;
-import javax.net.ssl.SSLHandshakeException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.crypto.key.KeyProvider;
@@ -1,4 +1,4 @@
-/**
+/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
@@ -22,6 +22,9 @@
import java.io.FileDescriptor;
import java.io.IOException;
import java.util.StringJoiner;
+import java.nio.ByteBuffer;
+import java.util.List;
+import java.util.function.IntFunction;

import org.apache.hadoop.classification.InterfaceAudience;
import org.apache.hadoop.classification.InterfaceStability;
@@ -158,8 +161,24 @@ public IOStatistics getIOStatistics() {
@Override
public String toString() {
return new StringJoiner(", ",
-BufferedFSInputStream.class.getSimpleName() + "[", "]")
-.add("in=" + in)
-.toString();
+BufferedFSInputStream.class.getSimpleName() + "[", "]")
+.add("in=" + in)
+.toString();
}

+@Override
+public int minSeekForVectorReads() {
+return ((PositionedReadable) in).minSeekForVectorReads();
+}
+
+@Override
+public int maxReadSizeForVectorReads() {
+return ((PositionedReadable) in).maxReadSizeForVectorReads();
+}
+
+@Override
+public void readVectored(List<? extends FileRange> ranges,
+IntFunction<ByteBuffer> allocate) throws IOException {
+((PositionedReadable) in).readVectored(ranges, allocate);
+}
}
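The `readVectored` contract the hunk above delegates to — submit a batch of positioned ranges up front, let each range complete independently into a caller-allocated buffer — can be sketched with JDK primitives alone. This is an illustrative stand-in, not Hadoop code: `Range` approximates `FileRange`, and the `readVectored` helper mirrors the `(List<? extends FileRange>, IntFunction<ByteBuffer>)` shape of `PositionedReadable.readVectored` using `AsynchronousFileChannel`.

```java
import java.nio.ByteBuffer;
import java.nio.channels.AsynchronousFileChannel;
import java.nio.channels.CompletionHandler;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.IntFunction;

public class VectoredReadSketch {
  // Stand-in for Hadoop's FileRange: an (offset, length) pair plus a
  // future that completes when that range's bytes have been read.
  static final class Range {
    final long offset;
    final int length;
    final CompletableFuture<ByteBuffer> data = new CompletableFuture<>();
    Range(long offset, int length) { this.offset = offset; this.length = length; }
  }

  // Issue every read immediately; callers await each range on its own
  // future, mirroring readVectored(List<FileRange>, IntFunction<ByteBuffer>).
  static void readVectored(AsynchronousFileChannel ch, List<Range> ranges,
                           IntFunction<ByteBuffer> allocate) {
    for (Range r : ranges) {
      ByteBuffer buf = allocate.apply(r.length);
      ch.read(buf, r.offset, buf, new CompletionHandler<Integer, ByteBuffer>() {
        @Override public void completed(Integer n, ByteBuffer b) {
          b.flip();                      // make the bytes readable
          r.data.complete(b);
        }
        @Override public void failed(Throwable t, ByteBuffer b) {
          r.data.completeExceptionally(t);
        }
      });
    }
  }

  public static void main(String[] args) throws Exception {
    Path p = Files.createTempFile("vec", ".bin");
    Files.write(p, "abcdefghij".getBytes());
    try (AsynchronousFileChannel ch =
             AsynchronousFileChannel.open(p, StandardOpenOption.READ)) {
      // Two disjoint ranges read in one batch: bytes [0,3) and [7,10).
      List<Range> ranges = List.of(new Range(0, 3), new Range(7, 3));
      readVectored(ch, ranges, ByteBuffer::allocate);
      StringBuilder sb = new StringBuilder();
      for (Range r : ranges) {
        ByteBuffer b = r.data.join();    // block until this range completes
        byte[] out = new byte[b.remaining()];
        b.get(out);
        sb.append(new String(out)).append(' ');
      }
      System.out.println(sb.toString().trim());   // prints: abc hij
    }
    Files.delete(p);
  }
}
```

The per-range future is the key design point: a consumer such as an ORC or Parquet reader can start decoding whichever stripe's bytes arrive first instead of waiting for the whole batch, which is where the "high-performance" claim in HADOOP-11867/HADOOP-18103 comes from.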