
Problem modifying file code with static import. #2927

Closed

LuanPereiraLima opened this issue Apr 3, 2019 · 29 comments

@LuanPereiraLima

When I modify a file (I only modify the parts that have try/catch blocks), everything that is imported with static (for example: import static org.fusesource.leveldbjni.JniDBFactory.asString;) is removed from the code, so when I compile the code again I get "variable not found" errors, which correspond exactly to these static imports.

I have the two files here for comparison.
I will show only the imports.

ORIGINAL CODE:

package org.apache.hadoop.mapred;

import static org.fusesource.leveldbjni.JniDBFactory.asString;
import static org.fusesource.leveldbjni.JniDBFactory.bytes;
import static org.jboss.netty.buffer.ChannelBuffers.wrappedBuffer;
import static org.jboss.netty.handler.codec.http.HttpHeaders.Names.CONTENT_TYPE;
import static org.jboss.netty.handler.codec.http.HttpMethod.GET;
import static org.jboss.netty.handler.codec.http.HttpResponseStatus.BAD_REQUEST;
import static org.jboss.netty.handler.codec.http.HttpResponseStatus.FORBIDDEN;
import static org.jboss.netty.handler.codec.http.HttpResponseStatus.INTERNAL_SERVER_ERROR;
import static org.jboss.netty.handler.codec.http.HttpResponseStatus.METHOD_NOT_ALLOWED;
import static org.jboss.netty.handler.codec.http.HttpResponseStatus.NOT_FOUND;
import static org.jboss.netty.handler.codec.http.HttpResponseStatus.OK;
import static org.jboss.netty.handler.codec.http.HttpResponseStatus.UNAUTHORIZED;
import static org.jboss.netty.handler.codec.http.HttpVersion.HTTP_1_1;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.ByteBuffer;
import java.nio.channels.ClosedChannelException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.regex.Pattern;

import javax.crypto.SecretKey;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DataInputByteBuffer;
import org.apache.hadoop.io.DataOutputBuffer;
import org.apache.hadoop.io.ReadaheadPool;
import org.apache.hadoop.io.SecureIOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.proto.ShuffleHandlerRecoveryProtos.JobShuffleInfoProto;
import org.apache.hadoop.mapreduce.MRConfig;
import org.apache.hadoop.mapreduce.security.SecureShuffleUtils;
import org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier;
import org.apache.hadoop.mapreduce.security.token.JobTokenSecretManager;
import org.apache.hadoop.mapreduce.task.reduce.ShuffleHeader;
import org.apache.hadoop.metrics2.MetricsSystem;
import org.apache.hadoop.metrics2.annotation.Metric;
import org.apache.hadoop.metrics2.annotation.Metrics;
import org.apache.hadoop.metrics2.lib.DefaultMetricsSystem;
import org.apache.hadoop.metrics2.lib.MutableCounterInt;
import org.apache.hadoop.metrics2.lib.MutableCounterLong;
import org.apache.hadoop.metrics2.lib.MutableGaugeInt;
import org.apache.hadoop.security.proto.SecurityProtos.TokenProto;
import org.apache.hadoop.security.ssl.SSLFactory;
import org.apache.hadoop.security.token.Token;
import org.apache.hadoop.util.DiskChecker;
import org.apache.hadoop.util.Shell;
import org.apache.hadoop.util.concurrent.HadoopExecutors;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.proto.YarnServerCommonProtos.VersionProto;
import org.apache.hadoop.yarn.server.api.ApplicationInitializationContext;
import org.apache.hadoop.yarn.server.api.ApplicationTerminationContext;
import org.apache.hadoop.yarn.server.api.AuxiliaryService;
import org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer;
import org.apache.hadoop.yarn.server.records.Version;
import org.apache.hadoop.yarn.server.records.impl.pb.VersionPBImpl;
import org.apache.hadoop.yarn.server.utils.LeveldbIterator;
import org.fusesource.leveldbjni.JniDBFactory;
import org.fusesource.leveldbjni.internal.NativeDB;
import org.iq80.leveldb.DB;
import org.iq80.leveldb.DBException;
import org.iq80.leveldb.Options;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.buffer.ChannelBuffers;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.ChannelFuture;
import org.jboss.netty.channel.ChannelFutureListener;
import org.jboss.netty.channel.ChannelHandler;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.ChannelStateEvent;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.MessageEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
import org.jboss.netty.channel.group.ChannelGroup;
import org.jboss.netty.channel.group.DefaultChannelGroup;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
import org.jboss.netty.handler.codec.frame.TooLongFrameException;
import org.jboss.netty.handler.codec.http.DefaultHttpResponse;
import org.jboss.netty.handler.codec.http.HttpChunkAggregator;
import org.jboss.netty.handler.codec.http.HttpRequest;
import org.jboss.netty.handler.codec.http.HttpRequestDecoder;
import org.jboss.netty.handler.codec.http.HttpResponse;
import org.jboss.netty.handler.codec.http.HttpResponseEncoder;
import org.jboss.netty.handler.codec.http.HttpResponseStatus;
import org.jboss.netty.handler.codec.http.QueryStringDecoder;
import org.jboss.netty.handler.ssl.SslHandler;
import org.jboss.netty.handler.stream.ChunkedWriteHandler;
import org.jboss.netty.handler.timeout.IdleState;
import org.jboss.netty.handler.timeout.IdleStateAwareChannelHandler;
import org.jboss.netty.handler.timeout.IdleStateEvent;
import org.jboss.netty.handler.timeout.IdleStateHandler;
import org.jboss.netty.util.CharsetUtil;
import org.jboss.netty.util.HashedWheelTimer;
import org.jboss.netty.util.Timer;
import org.eclipse.jetty.http.HttpHeader;
import org.slf4j.LoggerFactory;

import com.google.common.annotations.VisibleForTesting;
import com.google.common.base.Charsets;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.cache.RemovalListener;
import com.google.common.cache.RemovalNotification;
import com.google.common.cache.Weigher;
import com.google.common.util.concurrent.ThreadFactoryBuilder;
import com.google.protobuf.ByteString;

OUTPUT CODE:

package org.apache.hadoop.mapred;


import ChannelFutureListener.CLOSE;
import Charsets.UTF_8;
import HttpHeader.CONNECTION;
import HttpHeader.CONTENT_LENGTH;
import HttpHeader.KEEP_ALIVE;
import JniDBFactory.factory;
import JobID.JOBID_REGEX;
import MRConfig.SHUFFLE_SSL_ENABLED_DEFAULT;
import MRConfig.SHUFFLE_SSL_ENABLED_KEY;
import SSLFactory.Mode;
import SecureShuffleUtils.HTTP_HEADER_REPLY_URL_HASH;
import SecureShuffleUtils.HTTP_HEADER_URL_HASH;
import ShuffleHeader.DEFAULT_HTTP_HEADER_NAME;
import ShuffleHeader.DEFAULT_HTTP_HEADER_VERSION;
import ShuffleHeader.HTTP_HEADER_NAME;
import ShuffleHeader.HTTP_HEADER_VERSION;
import com.google.common.annotations.VisibleForTesting;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.LoadingCache;
import com.google.common.cache.RemovalNotification;
import com.google.common.util.concurrent.ThreadFactoryBuilder;
import com.google.protobuf.ByteString;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.ByteBuffer;
import java.nio.channels.ClosedChannelException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.regex.Pattern;
import javax.crypto.SecretKey;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DataInputByteBuffer;
import org.apache.hadoop.io.DataOutputBuffer;
import org.apache.hadoop.io.ReadaheadPool;
import org.apache.hadoop.io.SecureIOUtils;
import org.apache.hadoop.mapred.proto.ShuffleHandlerRecoveryProtos.JobShuffleInfoProto;
import org.apache.hadoop.mapreduce.security.SecureShuffleUtils;
import org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier;
import org.apache.hadoop.mapreduce.security.token.JobTokenSecretManager;
import org.apache.hadoop.mapreduce.task.reduce.ShuffleHeader;
import org.apache.hadoop.metrics2.MetricsSystem;
import org.apache.hadoop.metrics2.annotation.Metric;
import org.apache.hadoop.metrics2.annotation.Metrics;
import org.apache.hadoop.metrics2.lib.DefaultMetricsSystem;
import org.apache.hadoop.metrics2.lib.MutableCounterInt;
import org.apache.hadoop.metrics2.lib.MutableCounterLong;
import org.apache.hadoop.metrics2.lib.MutableGaugeInt;
import org.apache.hadoop.security.proto.SecurityProtos.TokenProto;
import org.apache.hadoop.security.ssl.SSLFactory;
import org.apache.hadoop.security.token.Token;
import org.apache.hadoop.util.DiskChecker;
import org.apache.hadoop.util.Shell;
import org.apache.hadoop.util.concurrent.HadoopExecutors;
import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.proto.YarnServerCommonProtos.VersionProto;
import org.apache.hadoop.yarn.server.api.ApplicationInitializationContext;
import org.apache.hadoop.yarn.server.api.ApplicationTerminationContext;
import org.apache.hadoop.yarn.server.api.AuxiliaryService;
import org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer;
import org.apache.hadoop.yarn.server.records.Version;
import org.apache.hadoop.yarn.server.utils.LeveldbIterator;
import org.fusesource.leveldbjni.internal.NativeDB;
import org.iq80.leveldb.DB;
import org.iq80.leveldb.DBException;
import org.iq80.leveldb.Options;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.buffer.ChannelBuffers;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.ChannelFuture;
import org.jboss.netty.channel.ChannelFutureListener;
import org.jboss.netty.channel.ChannelHandler;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.ChannelStateEvent;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.MessageEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
import org.jboss.netty.channel.group.ChannelGroup;
import org.jboss.netty.channel.group.DefaultChannelGroup;
import org.jboss.netty.handler.codec.frame.TooLongFrameException;
import org.jboss.netty.handler.codec.http.HttpChunkAggregator;
import org.jboss.netty.handler.codec.http.HttpRequest;
import org.jboss.netty.handler.codec.http.HttpRequestDecoder;
import org.jboss.netty.handler.codec.http.HttpResponse;
import org.jboss.netty.handler.codec.http.HttpResponseEncoder;
import org.jboss.netty.handler.codec.http.HttpResponseStatus;
import org.jboss.netty.handler.ssl.SslHandler;
import org.jboss.netty.handler.stream.ChunkedWriteHandler;
import org.jboss.netty.handler.timeout.IdleState;
import org.jboss.netty.handler.timeout.IdleStateAwareChannelHandler;
import org.jboss.netty.handler.timeout.IdleStateEvent;
import org.jboss.netty.util.HashedWheelTimer;
import org.jboss.netty.util.Timer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

This is the Spoon (7.3.0) configuration that I use in my code:

spoon = new Launcher();
spoon.getEnvironment().setNoClasspath(true);
spoon.addInputResource(f.getAbsolutePath());
spoon.getEnvironment().setCommentEnabled(true);
spoon.getEnvironment().setAutoImports(true);
spoon.setSourceOutputDirectory(PathProject.getPathTemp());

I have also already tried spoon.getEnvironment().setAutoImports(false), but the problem persists.

@monperrus
Collaborator

Thanks for the bug report. Could you create a pull-request with a failing test case?

@LuanPereiraLima
Author

I'm not sure how to create a test that demonstrates this bug. Should I go into the /src/test/java/spoon/test folder and create a test case there with my test files?

@nharrand
Collaborator

nharrand commented Apr 4, 2019

You can put the test data (i.e. the file containing source code with static imports) in src/test/resources/static-import/StaticImport.java, and the code for the actual test in src/test/java/spoon/StaticImportTest.java.
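
For reference, a minimal sketch of what such a test could look like (assumptions: JUnit 4, as used elsewhere in Spoon's test suite, and a resource file at the path above containing at least one static import that the printer must keep):

import static org.junit.Assert.assertTrue;

import org.junit.Test;
import spoon.Launcher;
import spoon.reflect.declaration.CtType;

public class StaticImportTest {
    @Test
    public void staticImportsSurvivePrinting() {
        Launcher launcher = new Launcher();
        launcher.getEnvironment().setNoClasspath(true);
        launcher.getEnvironment().setAutoImports(true);
        launcher.addInputResource("./src/test/resources/static-import/StaticImport.java");
        launcher.buildModel();

        CtType<?> type = launcher.getFactory().Class().getAll().get(0);

        // printing the type with its imports must not drop the static imports
        assertTrue(type.toStringWithImports().contains("import static"));
    }
}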

@LuanPereiraLima
Author

I created a test case that generates the file shown above. How does creating a pull request work?

But my test only generates the output in /target/spoon/static_imports/src/org/apache/hadoop/mapred/ShuffleHandler.java, so you can analyze the code.

@LuanPereiraLima
Author

outputTest.txt

This is the output of the test: the compilation errors in Hadoop.

@LuanPereiraLima
Author

spoon.getEnvironment().setAutoImports(false);

outputTestAutoImportFALSE.txt

Here is the output of the Hadoop tests when I make the modification using setAutoImports(false).

@nharrand
Collaborator

nharrand commented Apr 4, 2019

To create a pull request, you need to (assuming that you have already cloned Spoon on your machine):

  1. fork the project (spoon) and add your fork as a remote on your local machine (git remote add myfork url-of-your-fork)
  2. (Optional) create a branch (git checkout -b my-patch)
  3. Make the changes that you want to include on your local branch and commit them
  4. Push your changes to your fork (git push myfork my-patch)
  5. From your GitHub fork page, create the pull request.

Note that once your pull request is created (and until it is accepted or refused) you will still be able to push new changes.

Here is some relevant documentation

LuanPereiraLima pushed a commit to LuanPereiraLima/spoon that referenced this issue Apr 4, 2019
@LuanPereiraLima
Author

Okay, I think I managed to do it.

@monperrus monperrus added the bug label Apr 4, 2019
@LuanPereiraLima
Author

Guys, my master's degree needs this :(

@monperrus
Collaborator

monperrus commented Apr 5, 2019

instead of :( go for a more positive emoji 🍺

@nharrand
Collaborator

nharrand commented Apr 5, 2019

OK, after investigation: this problem should only happen in noClasspath mode. So if you have the possibility to provide the full classpath, you could easily work around it.

For example, if the sources that you are processing are part of a Maven project, I suggest you use the MavenLauncher (it will compute the classpath for you).
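
A minimal sketch of that MavenLauncher variant (sketch only: the module path below is a placeholder, and SOURCE_TYPE.APP_SOURCE selects the application sources):

// spoon.MavenLauncher computes the source classpath from the module's pom.xml;
// "path/to/the/maven/module" is a placeholder for the project you process.
MavenLauncher spoon = new MavenLauncher("path/to/the/maven/module",
        MavenLauncher.SOURCE_TYPE.APP_SOURCE);
spoon.getEnvironment().setCommentEnabled(true);
spoon.getEnvironment().setAutoImports(true);
spoon.run();
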
With the classic Launcher, you can add classpath elements with:

Launcher spoon = new Launcher();
spoon.getEnvironment().setSourceClasspath(new String[]{path});

Note that in this context, path refers to a directory containing the bytecode of the dependencies used by your sources, not Java sources.

Anyway, this is still an issue that we should fix. I am only suggesting these workarounds because this seems to be a pressing issue for you!

@nharrand
Collaborator

nharrand commented Apr 5, 2019

Depending on what you want to do with the transformed sources, you could also consider using

spoon.getEnvironment().setAutoImports(false);

In this case your processed sources will be printed with no imports at all, but the code will refer directly to fully qualified names, making these imports unnecessary for compilation (but less human-readable).
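
For illustration (hypothetical output, not taken from the issue), a call that originally relied on a static import such as bytes(...) would then be printed with its fully qualified owner:

// with setAutoImports(false) the printer emits fully qualified names instead
// of relying on (static) imports:
byte[] key = org.fusesource.leveldbjni.JniDBFactory.bytes("someKey");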

A last option you may consider would be to use the SniperJavaPrettyPrinter with:

launcher.getEnvironment().setPrettyPrinterCreator(() ->
    new SniperJavaPrettyPrinter(launcher.getEnvironment())
);

This should only change the parts of the source file that you transformed (and hopefully, imports won't be impacted).

To sum things up, you have 4 options to bypass this problem:

  1. set up the classpath
  2. use MavenLauncher (if the project you are analyzing / processing uses Maven)
  3. disable auto import
  4. use the SniperJavaPrettyPrinter

This page of the documentation should help with all these options.

@LuanPereiraLima
Author

I have tested the solutions you showed me. I used the MavenLauncher and got the same problem (compile error: 'cannot find symbol'). I did not test the first option, as the MavenLauncher adds the classpath automatically. With the SniperJavaPrettyPrinter, I get errors at the time of modifying the file (besides the error in the console, some Java classes come out with nothing written in them). I also tested another Hadoop subproject, but I get the same results.

@nharrand
Collaborator

I am having a hard time reproducing your errors. Can you give me a little bit more context on what you want to achieve and the errors that you get?

@LuanPereiraLima
Author

LuanPereiraLima commented Apr 10, 2019

Yes. I'm making modifications to Hadoop subprojects (https://github.com/apache/hadoop). The example I cited above was this subproject: https://github.com/apache/hadoop/tree/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle. I used the class https://github.com/apache/hadoop/blob/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java to remove catch blocks and add a finally block (if one does not exist). I modify the try/catch blocks because they are part of my research (creation of mutations focused on exception handling). I have already used several systems to do this; now I was using Hadoop and I ran into the problems I showed you. Basically, when I modify the code, the Java file generated by the modification produces compilation errors due to the missing imports, which Spoon does not add as it should. You can run a test to see the problem: take the same Java class I showed above from the repository, make a simple modification to it (add only one comment), take the modified output class, replace it in the Hadoop project, and run the tests (subproject only); you will see several compilation errors that did not exist before. When I analyzed the classes, I noticed that static imports were not being added as they should be.
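
For context, a minimal sketch of that kind of mutation written with Spoon's processor API (hypothetical code, not the author's actual implementation): it drops the catch blocks of every try and adds an empty finally block when none exists.

import java.util.Collections;

import spoon.processing.AbstractProcessor;
import spoon.reflect.code.CtTry;

// Hypothetical sketch of the mutation described above (remove catch blocks,
// ensure a finally block exists); not the author's actual code.
public class CatchRemovalProcessor extends AbstractProcessor<CtTry> {
    @Override
    public void process(CtTry tryBlock) {
        // drop all catch blocks of this try
        tryBlock.setCatchers(Collections.emptyList());
        // add an empty finally block if there is none
        if (tryBlock.getFinalizer() == null) {
            tryBlock.setFinalizer(getFactory().Core().createBlock());
        }
    }
}

Such a processor would be registered on the launcher (spoon.addProcessor(...)) before running it.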

@LuanPereiraLima
Author

Did you get the same problem?

@monperrus
Collaborator

No, we cannot reproduce your problem.

Hadoop is a huge project with a very complex build and classpath setup. Can you try with something smaller (see the projects in Table 5 of this paper) and tell us how it goes?

@LuanPereiraLima
Author

Ah, but these dependency problems can be solved by running the configuration file https://github.com/apache/hadoop/blob/trunk/start-build-env.sh: it starts a Docker container that already has all the dependencies installed. Once you execute this file on your machine, you can run the tests of the subproject I showed. I've already run Spoon on other projects; what interests me now for the master's degree is Hadoop.

@monperrus
Collaborator

I understand.

Then we have to work on the exact same bug.

Could you reproduce the bug on Travis using Docker in the dedicated repo https://github.com/SpoonLabs/spoon-hadoop? (You have the right to push there.)

@LuanPereiraLima
Author

I have never used Travis, so I do not know how to proceed to show you the result of the problem with it. I use a Docker image here on my computer; should I commit this image and push it to Docker Cloud, and then write a script on Travis that uses that image? Or something like that.

@monperrus
Collaborator

monperrus commented Apr 14, 2019 via email

@monperrus
Collaborator

Fixed in master by #2936

@monperrus
Collaborator

Thanks for the bug report.

@LuanPereiraLima
Author

#2988

@LuanPereiraLima
Author

Below is the compilation problem.

[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[21,29] package ChannelFutureListener does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[22,16] package Charsets does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[23,18] package HttpHeader does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[24,18] package HttpHeader does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[25,18] package HttpHeader does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[26,20] package JniDBFactory does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[27,13] package JobID does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[28,16] package MRConfig does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[29,16] package MRConfig does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[30,18] package SSLFactory does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[31,26] package SecureShuffleUtils does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[32,26] package SecureShuffleUtils does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[33,21] package ShuffleHeader does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[34,21] package ShuffleHeader does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[35,21] package ShuffleHeader does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[36,21] package ShuffleHeader does not exist
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[604,50] cannot find symbol
symbol: variable JOBID_REGEX
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[608,27] cannot find symbol
symbol: method bytes(java.lang.String)
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[611,34] cannot find symbol
symbol: method asString(byte[])
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[634,23] cannot find symbol
symbol: variable factory
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[635,18] incompatible types: org.fusesource.leveldbjni.internal.NativeDB cannot be converted to java.lang.Throwable
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[636,19] cannot find symbol
symbol: method isNotFound()
location: variable e of type org.fusesource.leveldbjni.internal.NativeDB
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[636,39] cannot find symbol
symbol: method getMessage()
location: variable e of type org.fusesource.leveldbjni.internal.NativeDB
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[640,31] cannot find symbol
symbol: variable factory
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[646,17] incompatible types: org.fusesource.leveldbjni.internal.NativeDB cannot be converted to java.lang.Throwable
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[654,35] cannot find symbol
symbol: method bytes(java.lang.String)
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[667,9] cannot find symbol
symbol: method getProto()
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[669,25] cannot find symbol
symbol: method bytes(java.lang.String)
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[753,29] cannot find symbol
symbol: method bytes(java.lang.String)
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[767,32] cannot find symbol
symbol: method bytes(java.lang.String)
location: class org.apache.hadoop.mapred.ShuffleHandler
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[799,33] cannot find symbol
symbol: variable SHUFFLE_SSL_ENABLED_KEY
location: class org.apache.hadoop.mapred.ShuffleHandler.HttpPipelineFactory
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[800,13] cannot find symbol
symbol: variable SHUFFLE_SSL_ENABLED_DEFAULT
location: class org.apache.hadoop.mapred.ShuffleHandler.HttpPipelineFactory
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[802,45] cannot find symbol
symbol: variable Mode
location: class org.apache.hadoop.mapred.ShuffleHandler.HttpPipelineFactory
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[936,43] cannot find symbol
symbol: variable GET
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[936,42] illegal start of type
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[937,32] cannot find symbol
symbol: variable METHOD_NOT_ALLOWED
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[941,20] cannot find symbol
symbol: variable DEFAULT_HTTP_HEADER_NAME
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[943,35] cannot find symbol
symbol: variable HTTP_HEADER_NAME
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[944,16] cannot find symbol
symbol: variable DEFAULT_HTTP_HEADER_VERSION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[947,13] cannot find symbol
symbol: variable HTTP_HEADER_VERSION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[948,72] cannot find symbol
symbol: variable BAD_REQUEST
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[951,13] cannot find symbol
symbol: method getParameters()
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[973,70] cannot find symbol
symbol: variable BAD_REQUEST
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[977,66] cannot find symbol
symbol: variable BAD_REQUEST
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[987,56] cannot find symbol
symbol: variable BAD_REQUEST
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[990,53] cannot find symbol
symbol: variable BAD_REQUEST
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[996,32] cannot find symbol
symbol: variable FORBIDDEN
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[999,96] cannot find symbol
symbol: variable HTTP_1_1
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[999,106] cannot find symbol
symbol: variable OK
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1005,48] cannot find symbol
symbol: variable UNAUTHORIZED
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1025,46] cannot find symbol
symbol: variable INTERNAL_SERVER_ERROR
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1072,59] cannot find symbol
symbol: variable NOT_FOUND
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1084,21] cannot find symbol
symbol: variable INTERNAL_SERVER_ERROR
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1188,40] cannot find symbol
symbol: variable CONNECTION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1191,40] cannot find symbol
symbol: variable CONTENT_LENGTH
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1193,40] cannot find symbol
symbol: variable CONNECTION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1194,17] cannot find symbol
symbol: variable KEEP_ALIVE
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1195,40] cannot find symbol
symbol: variable KEEP_ALIVE
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1223,35] cannot find symbol
symbol: variable HTTP_HEADER_URL_HASH
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1237,65] cannot find symbol
symbol: variable UTF_8
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1240,13] cannot find symbol
symbol: variable HTTP_HEADER_REPLY_URL_HASH
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1242,36] cannot find symbol
symbol: variable HTTP_HEADER_NAME
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1243,13] cannot find symbol
symbol: variable DEFAULT_HTTP_HEADER_NAME
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1244,36] cannot find symbol
symbol: variable HTTP_HEADER_VERSION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1245,13] cannot find symbol
symbol: variable DEFAULT_HTTP_HEADER_VERSION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1261,22] cannot find symbol
symbol: method wrappedBuffer(byte[],int,int)
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1314,96] cannot find symbol
symbol: variable HTTP_1_1
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1315,36] cannot find symbol
symbol: variable CONTENT_TYPE
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1317,36] cannot find symbol
symbol: variable HTTP_HEADER_NAME
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1318,13] cannot find symbol
symbol: variable DEFAULT_HTTP_HEADER_NAME
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1319,36] cannot find symbol
symbol: variable HTTP_HEADER_VERSION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1320,13] cannot find symbol
symbol: variable DEFAULT_HTTP_HEADER_VERSION
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1325,46] cannot find symbol
symbol: variable CharsetUtil
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1328,58] cannot find symbol
symbol: variable CLOSE
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1337,32] cannot find symbol
symbol: variable BAD_REQUEST
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[ERROR] /home/loopback/mutations/hadoop-3.1.2-12-hadoop-mapreduce-project--hadoop-mapreduce-client--hadoop-mapreduce-client-shuffle/CBD/1/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle/src/main/java/org/apache/hadoop/mapred/ShuffleHandler.java:[1354,32] cannot find symbol
symbol: variable INTERNAL_SERVER_ERROR
location: class org.apache.hadoop.mapred.ShuffleHandler.Shuffle
[INFO] 76 errors

Here is the modified part of the code.

CodeBefore=
// TODO these bytes should be versioned
try {
    Token jt = ShuffleHandler.deserializeServiceData(secret);
    // TODO: Once SHuffle is out of NM, this can use MR APIs
    JobID jobId = new JobID(Long.toString(appId.getClusterTimestamp()), appId.getId());
    recordJobShuffleInfo(jobId, user, jt);
} catch (IOException e) {
    ShuffleHandler.LOG.error("Error during initApp", e);
    // TODO add API to AuxiliaryServices to report failures
},

CodeAfter=
// TODO these bytes should be versioned
try {
    Token jt = ShuffleHandler.deserializeServiceData(secret);
    // TODO: Once SHuffle is out of NM, this can use MR APIs
    JobID jobId = new JobID(Long.toString(appId.getClusterTimestamp()), appId.getId());
    recordJobShuffleInfo(jobId, user, jt);
} finally {}

SPOON CONFIGURATION:

spoon.getEnvironment().setNoClasspath(true);
spoon.getEnvironment().setCommentEnabled(true);
spoon.getEnvironment().setAutoImports(true);
spoon.getEnvironment().setPreserveLineNumbers(true);

@krakowski
Contributor

krakowski commented Aug 15, 2019

#2936 seems to have fixed this only when using a PrettyPrinter obtained through Launcher::createPrettyPrinter.

When using CtType::toStringWithImports, (unresolved) static imports are still removed.

Example

Launcher launcher = new Launcher();
launcher.getEnvironment().setNoClasspath(true);
launcher.getEnvironment().setAutoImports(true);
launcher.addInputResource(
    new VirtualFile("package com.test;\n\n" +
                    "\n" +
                    "import static org.assertj.core.api.Assertions.*;\n" +
                    "\n" +
                    "public class ImportTest {\n" +
                    "\n" +
                    "    public static void foo() {\n" +
                    "        assertThat(42).isEqualTo(42);\n" +
                    "    }\n" +
                    "}"));

launcher.buildModel();
System.out.println(launcher.getFactory().Class().getAll().get(0).toStringWithImports());

Result

package com.test;


public class ImportTest {
    public static void foo() {
        assertThat(42).isEqualTo(42);
    }
}

Using a PrettyPrinter seems to break the code.

Example

Launcher launcher = new Launcher();
launcher.getEnvironment().setNoClasspath(true);
launcher.getEnvironment().setAutoImports(true);
launcher.addInputResource(
        new VirtualFile("package com.test;\n\n" +
                        "\n" +
                        "import static org.assertj.core.api.Assertions.*;\n" +
                        "\n" +
                        "public class ImportTest {\n" +
                        "\n" +
                        "    public static void foo() {\n" +
                        "        assertThat(42).isEqualTo(42);\n" +
                        "    }\n" +
                        "}"));

launcher.buildModel();
PrettyPrinter printer = launcher.createPrettyPrinter();
CtType<?> type = launcher.getFactory().Class().getAll().get(0);
printer.calculate(type.getPosition().getCompilationUnit(), List.of(type));
System.out.println(printer.getResult());

Result

package com.test;


import static org.assertj.core.api.Assertions.*;


public class ImportTest {
    public static void foo() {
        isEqualTo(42);
    }
}

Note how assertThat(42). was removed from the method's body.

@monperrus
Collaborator

Thanks for the bug report, reopening this one.

Would you have a look at a possible fix, inspired by #2936?

@monperrus monperrus reopened this Aug 17, 2019
@krakowski
Contributor

I stepped through the code and found that DefaultJavaPrettyPrinter::isImported returns true although (I think) it should not. If I manually set isImported to false my test case does not fail.

package com.test;

import static org.assertj.core.api.Assertions.*;

public class StaticImports {
  public static void foo() {
    assertThat(42).isEqualTo(42);
  }
}

@Test
public void testStaticMethodImports() {
  Launcher launcher = new Launcher();
  launcher.getEnvironment().setNoClasspath(true);
  launcher.getEnvironment().setAutoImports(true);
  launcher.addInputResource("./src/test/resources/noclasspath/imports/StaticImports.java");
  launcher.buildModel();

  DefaultJavaPrettyPrinter printer = new DefaultJavaPrettyPrinter(launcher.getEnvironment());
  printer.calculate(null, Collections.singletonList(launcher.getFactory().Class().getAll().get(0)));

  String result = printer.getResult();

  assertTrue(result.contains("assertThat(42)"));
}

Looking further, ImportScannerImpl::visitCtExecutableReference creates an import for isEqualTo(int) which should not happen in noclasspath mode. Is there a way to check whether the parent of a CtExecutableReference is a method invocation itself?

@monperrus
Collaborator

Great analysis.

Is there a way to check whether the parent of a CtExecutableReference is a method invocation itself?

Something like ctExecRef.getParent() instanceof CtInvocation
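
A hypothetical sketch of that check (illustrative names only, not the actual fix): an executable reference whose parent invocation is itself chained on another invocation, like the isEqualTo(42) in assertThat(42).isEqualTo(42), should not produce a static import.

import spoon.reflect.code.CtInvocation;
import spoon.reflect.declaration.CtElement;
import spoon.reflect.reference.CtExecutableReference;

// Illustrative helper only: true when the referenced executable is invoked on
// the result of another invocation (a chained call), in which case no static
// import should be created for it.
final class ImportChecks {
    static boolean isChainedInvocation(CtExecutableReference<?> ref) {
        CtElement parent = ref.getParent();
        return parent instanceof CtInvocation
                && ((CtInvocation<?>) parent).getTarget() instanceof CtInvocation;
    }
}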
