
[BUG]: The mistral pipeline command is missing #280

Open
clafollett opened this issue Jan 17, 2025 · 0 comments
Labels
bug Something isn't working

clafollett commented Jan 17, 2025

Bug description

I attempted to run the Mistral pipeline demo advertised on the Modular site, but received the following output stating that the mistral command does not exist:

23:39:53.958 INFO: MainThread: root: Logging initialized: Console: 20, File: None, Telemetry: None
Usage: pipelines.py [OPTIONS] COMMAND [ARGS]...
Try 'pipelines.py --help' for help.

Error: Command not supported: mistral
Supported commands: generate, list, llama3, replit, serve
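As far as I can tell, click only dispatches to subcommands that were registered on the command group, so this output just means pipelines.py never defines a mistral command. A toy illustration of that dispatch behavior (a hypothetical file, not the actual pipelines.py code):

import click

@click.group()
def main():
    """Toy stand-in for the pipelines.py command group."""

@main.command(name="llama3")
def run_llama3():
    """Only names registered via @main.command() can be invoked."""
    click.echo("running llama3")

if __name__ == "__main__":
    # `python toy.py mistral` exits with a "no such command" usage
    # error, analogous to the output above, because no "mistral"
    # command is registered on the group.
    main()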

I have a potential fix in a fork, but I do not believe I have it configured correctly: I get a bf16 data type error on my MacBook Pro M3 Max. I'm fairly new to the AI world and assume it's some missing transform config?

error: The bf16 data type is not supported on device 'cpu:0'.

@main.command(name="mistral")
@pipeline_config_options
@common_server_options
@click.option(
    "--prompt",
    type=str,
    default="Why is the sky blue?",
    help="The text prompt to use for further generation.",
)
@click.option(
    "--num-warmups",
    type=int,
    default=0,
    show_default=True,
    help="# of warmup iterations to run before the final timed run.",
)
@click.option(
    "--serve",
    type=bool,
    default=False,
    is_flag=True,
    show_default=True,
    help="Whether to serve an OpenAI HTTP endpoint on port 8000.",
)
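# NOTE: profile_serve, performance_fake, batch_timeout, and model_name are
# (presumably) injected by @common_server_options above; they are not
# declared by the @click.option calls shown here.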
def run_mistral(
    prompt,
    num_warmups,
    serve,
    profile_serve,
    performance_fake,
    batch_timeout,
    model_name,
    **config_kwargs,
):
    """Runs the Mistral pipeline."""

    # Update basic parameters.
    if config_kwargs["architecture"] is None:
        config_kwargs["architecture"] = "MistralForCausalLM"

    if config_kwargs["architecture"] != "MistralForCausalLM":
        msg = (
            f"provided architecture '{config_kwargs['architecture']}' not"
            " compatible with Mistral."
        )
        raise ValueError(msg)

    config_kwargs["trust_remote_code"] = True

    config = PipelineConfig(**config_kwargs)

    # if config.quantization_encoding not in [
    #     SupportedEncoding.bfloat16
    # ]:
    #     config.cache_strategy = KVCacheStrategy.NAIVE

    if serve:
        serve_pipeline(
            pipeline_config=config,
            profile=profile_serve,
            performance_fake=performance_fake,
            batch_timeout=batch_timeout,
            model_name=model_name,
        )
    else:
        generate_text_for_pipeline(
            pipeline_config=config, prompt=prompt, num_warmups=num_warmups
        )
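With the command registered, invoking pipelines.py mistral gets past dispatch, and the failure moves to the bf16 error shown earlier. My guess is that Mistral defaults to the bfloat16 encoding, which the graph compiler rejects on 'cpu:0'. Something like the guard below, placed right after config = PipelineConfig(**config_kwargs), may be the missing piece. This is only a sketch: config.quantization_encoding and SupportedEncoding.bfloat16 appear in the commented-out block above, but SupportedEncoding.float32 is my assumption, not verified MAX API, and a real fix would also need to check that the target device is actually CPU (left out because I don't know the config attribute for it):

# Hypothetical fallback (SupportedEncoding.float32 is assumed to exist):
# bf16 weights cannot run on 'cpu:0' per the error above, so drop to
# float32 when the pipeline is not targeting an accelerator.
if config.quantization_encoding == SupportedEncoding.bfloat16:
    config.quantization_encoding = SupportedEncoding.float32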

Steps to reproduce


Simply follow the steps outlined here:
Mistral NeMo

Review the pipeline definitions in the code file and note that the mistral command is not defined:
https://github.com/modular/max/blob/main/pipelines/python/pipelines.py#L141-L266

System information

- What OS did you install MAX on?
macOS Sequoia 15.2

- Provide version information for MAX by pasting the output of `max -v`
- Provide version information for Mojo by pasting the output of `mojo -v`
- Provide Magic CLI version by pasting the output of `magic -v`
magic 0.6.2 - (based on pixi 0.40.0)