Description:
We encountered an issue installing pyspark from PyPI with pdm in a devenv-deployed environment. Running the pyspark command consistently fails with the following error:
RuntimeError: Java gateway process exited before sending its port number
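For anyone hitting the same error: in our experience it generally means PySpark could not start a working JVM at all (no java binary on PATH, or a broken JAVA_HOME), rather than anything Spark-specific. A minimal diagnostic sketch (the helper name is ours, purely illustrative):

```python
import os
import shutil

def java_diagnostics(env=os.environ):
    """Collect the basic facts PySpark needs to launch its Java gateway.

    PySpark spawns a JVM subprocess and waits for it to report a port;
    if the JVM cannot start, it raises the "Java gateway process exited
    before sending its port number" RuntimeError seen above.
    """
    return {
        "JAVA_HOME": env.get("JAVA_HOME"),
        "java_on_path": shutil.which("java") is not None,
    }

print(java_diagnostics())
```

If `java_on_path` is False and `JAVA_HOME` is unset, the error is reproducible regardless of which package manager installed pyspark.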
Details:
The error occurs only on Linux machines, including NixOS and Linux-based CI environments; the same setup works fine on macOS. We have used similar setups in other projects without any issues, which makes this failure even more puzzling.
What We Tried:
Devenv Spark Installation: Initially, we tried installing Spark through devenv itself. However, the provided version did not match our needs (we require Spark 3.3.0 for compatibility with AWS Glue).
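For context, the version pin itself is what forces us onto PyPI; a sketch of the pyproject.toml fragment pdm manages (the exact pin is ours, driven by the Glue requirement):

```toml
# pyproject.toml (managed by pdm)
[project]
dependencies = [
    # Pinned to match the Spark version our AWS Glue target ships with.
    "pyspark==3.3.0",
]
```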
Custom Package Attempt: We attempted to create a custom package, but due to the performance issues with the Apache archive server, we had to abandon this approach.
Switch to Poetry: We also tried switching to poetry to manage the devenv environment, and even installed pyspark directly into our virtual environment. The error persisted across all configurations.
Nixpkgs Configuration: Finally, I noticed that the nixpkgs input in devenv was pointing to the new devenv-nixpkgs. This was the primary difference from our previous working setup; switching back to nixpkgs-unstable resolved the problem.
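The workaround, sketched as a devenv.yaml input override (assuming the standard devenv inputs syntax; the URL is the stock nixpkgs-unstable branch):

```yaml
# devenv.yaml -- point the nixpkgs input back at nixpkgs-unstable
# instead of the rolling devenv-nixpkgs fork.
inputs:
  nixpkgs:
    url: github:NixOS/nixpkgs/nixpkgs-unstable
```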
Conclusion:
It appears that the issue is related to the patches or configurations applied in the rolling devenv-nixpkgs package. Unfortunately, I am not certain about the specific patches or changes that might be causing this issue.
For reference, you can see the devenv configuration where this setup fails in this revision of our project and here is the link to the failing CI.
Request for Assistance:
Could you please investigate whether there are any specific changes in the devenv-nixpkgs that could be causing this incompatibility with PySpark on Linux environments? Any insights or suggestions would be greatly appreciated.
Thank you!
shahinism added a commit to DataChefHQ/inception that referenced this issue on Sep 5, 2024.