@@ -142,8 +142,6 @@ private[sql] class HiveSessionCatalog(
      // let's try to load it as a Hive's built-in function.
      // Hive is case insensitive.
      val functionName = funcName.unquotedString.toLowerCase(Locale.ROOT)
-     logWarning("Encountered a failure during looking up function:" +
Member:
Why don't we log at debug or info level that we'll try the fallback while we're here?

Member Author (@dongjoon-hyun, Jun 3, 2019):
Yes. Of course, we can add debug messages anywhere, but this is the natural flow for Hive built-in function lookup. Showing a message like "Encountered a failure during looking up function: NoSuchFunctionException" for every Hive function on its first invocation will not give much value to developers/users. It's also misleading, because Spark will go on to execute the function without any problem in that same execution.

Member:
I meant adding a debug message like "Hive fallback will be attempted for the function [abc]", because users/devs might not know which functions exist in Spark and which come from Hive, or at least where the fallback happens.

This warning should be removed anyway.

Member:

Let's talk about this somewhere else later. I think it's not a big deal at all. LGTM; it should be removed anyway.

Member Author (@dongjoon-hyun, Jun 3, 2019):
I see, @HyukjinKwon.
For that message, I guess the reason the code doesn't have debug/info logging originally is that Apache Spark's ambition was to implement everything on the Spark side and remove this fallback logic completely. :)

-       s" ${Utils.exceptionString(error)}")
Member Author:

We should not warn here.

Member:

Maybe this is intended for the true case of if (super.functionExists(name)) above? Like L138?

Member Author (@dongjoon-hyun, Jun 3, 2019):

Yeah, maybe, but I'm not sure about the original intention. For that line 138, since we already throw the error, we don't actually need to print a warning here with the same error.

      if (!hiveFunctions.contains(functionName)) {
        failFunctionLookup(funcName, Some(error))
      }
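To make the flow under discussion concrete, here is a minimal sketch of a case-insensitive fallback lookup that fails fast instead of warning, along the lines the thread settles on. This is a simplified illustration, not Spark's actual HiveSessionCatalog code: the sparkFunctions and hiveFunctions sets and the lookupFunction helper here are hypothetical stand-ins.

```scala
import java.util.Locale

object FallbackLookupSketch {
  // Hypothetical registry of Spark-native function names.
  val sparkFunctions: Set[String] = Set("upper", "lower")
  // Hypothetical set of known Hive built-in function names.
  val hiveFunctions: Set[String] = Set("histogram_numeric")

  def lookupFunction(name: String): String = {
    if (sparkFunctions.contains(name)) {
      s"spark:$name"
    } else {
      // Hive is case insensitive, so normalize before the fallback lookup.
      val functionName = name.toLowerCase(Locale.ROOT)
      if (!hiveFunctions.contains(functionName)) {
        // Fail fast: the caller already receives this error, so also
        // logging a warning with the same error would be redundant.
        throw new NoSuchElementException(s"Undefined function: $name")
      }
      s"hive:$functionName"
    }
  }

  def main(args: Array[String]): Unit = {
    println(lookupFunction("upper"))
    println(lookupFunction("HISTOGRAM_NUMERIC"))
  }
}
```

The point the reviewers agree on is visible here: when the Spark-side lookup misses, the fallback either resolves the name against Hive's built-ins or raises the lookup failure directly, with no intermediate warning for what is a routine first-invocation path.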