1011 Disable full stack trace when using spark connect #1024
base: master
Conversation
…ould-not-print-full-stack-trace
    is_pyspark_analysis_exception = (
        isinstance(error, AnalysisException) if AnalysisException else False
    )
    return (
        any(msg in str(error) for msg in specific_db_errors)
        or is_pyspark_analysis_exception
    )
If AnalysisException was imported successfully, this checks whether the error is an instance of pyspark's AnalysisException and handles it accordingly.
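The guarded check above can be sketched as a self-contained helper. The function name and the `specific_db_errors` parameter are illustrative stand-ins for how the PR's surrounding code supplies them, not jupysql's exact API:

```python
# Optional import: AnalysisException stays None when pyspark is absent.
try:
    from pyspark.sql.utils import AnalysisException
except ModuleNotFoundError:
    AnalysisException = None


def is_spark_analysis_error(error, specific_db_errors=()):
    """Return True if the error message matches a known database error
    string, or if it is a pyspark AnalysisException (when pyspark is
    installed). Mirrors the check in the diff above."""
    is_pyspark_analysis_exception = (
        isinstance(error, AnalysisException) if AnalysisException else False
    )
    return (
        any(msg in str(error) for msg in specific_db_errors)
        or is_pyspark_analysis_exception
    )
```

Because the isinstance test is skipped when `AnalysisException` is None, the helper behaves the same whether or not pyspark is installed.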
try:
    from pyspark.sql.utils import AnalysisException
except ModuleNotFoundError:
    AnalysisException = None
This handles the case where the pyspark module is not installed.
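The optional-dependency pattern used here can be demonstrated with a deliberately missing module, so it runs even without pyspark. The module and class names below are stand-ins, not real packages:

```python
# The import is attempted once; a None sentinel marks "not installed".
# "nonexistent_pkg" is a stand-in for pyspark to force the fallback path.
try:
    from nonexistent_pkg.errors import SomeException
except ModuleNotFoundError:
    SomeException = None


def matches(error):
    """Degrade gracefully: skip the isinstance check when the optional
    dependency (and therefore its exception class) is unavailable."""
    return isinstance(error, SomeException) if SomeException else False
```

This avoids both a hard ImportError at module load time and a TypeError from calling `isinstance(error, None)` later.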
@@ -9,9 +9,9 @@
 def handle_spark_dataframe(dataframe, should_cache=False):
-    """Execute a ResultSet sqlaproxy using pysark module."""
+    """Execute a ResultSet sqlaproxy using pyspark module."""
Fix typo
 if not DataFrame and not CDataFrame:
-    raise exceptions.MissingPackageError("pysark not installed")
+    raise exceptions.MissingPackageError("pyspark not installed")
Fix typo
-    # Pyspark
-    "UNRESOLVED_ROUTINE",
-    "PARSE_SYNTAX_ERROR",
Removed these error codes, as they are already covered by AnalysisException.
Describe your changes
When short_errors is enabled, show only the Spark SQL error rather than the full stack trace.
Issue number
Closes #1011
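The idea behind the description above can be sketched with the standard library: when a known database error is detected, print only the final exception line instead of the full traceback. Note that `short_errors` is a jupysql SqlMagic option; the helper below is an illustrative sketch, not jupysql's actual implementation:

```python
import traceback


def short_error_message(error):
    """Return only the final exception line (e.g. 'ValueError: bad query'),
    dropping the stack trace the way a short_errors-style option would."""
    return traceback.format_exception_only(type(error), error)[-1].strip()


print(short_error_message(ValueError("PARSE_SYNTAX_ERROR near ';'")))
# → ValueError: PARSE_SYNTAX_ERROR near ';'
```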
Checklist before requesting a review
pkgmt format
📚 Documentation preview 📚: https://jupysql--1024.org.readthedocs.build/en/1024/