
Conversation


@iofu728 iofu728 commented Mar 11, 2025

What does this PR do?

Fixes bf16 support in the prompt compressor by removing a .numpy() conversion that fails on bfloat16 tensors.

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Was this discussed/approved via a GitHub issue? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Who can review?

@iofu728

@iofu728 iofu728 added the bug Something isn't working label Mar 11, 2025
@iofu728 iofu728 requested a review from Copilot March 11, 2025 09:59
@iofu728 iofu728 self-assigned this Mar 11, 2025

Copilot AI left a comment


PR Overview

This PR addresses bf16 support improvements in LLMLingua. Key changes include:

  • Updating the copyright header to include 2025.
  • Removing a redundant .numpy() call in the prompt compressor, since PyTorch cannot convert bfloat16 tensors to NumPy arrays directly.

Reviewed Changes

File                            Description
llmlingua/prompt_compressor.py  Removed the .numpy() conversion in the sentence packaging logic to support bf16
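For context on why dropping the .numpy() call matters: PyTorch has no matching NumPy dtype for bfloat16, so calling .numpy() on a bf16 tensor raises a TypeError. A minimal sketch (not the PR's actual code) illustrating the failure and the explicit upcast that works when a NumPy array is genuinely needed:

```python
import torch

# bfloat16 has no NumPy equivalent, so .numpy() raises a TypeError
t = torch.tensor([0.5, 1.5], dtype=torch.bfloat16)
try:
    t.numpy()
    raised = False
except TypeError:
    raised = True  # e.g. "Got unsupported ScalarType BFloat16"

# an explicit upcast to float32 works when a NumPy array is really needed
arr = t.float().numpy()
```

Removing the conversion entirely, as this PR does, sidesteps the issue by keeping the value as a tensor.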

Copilot reviewed 1 out of 1 changed files in this pull request and generated no comments.

Comments suppressed due to low confidence (1)

llmlingua/prompt_compressor.py:1337

  • Removing the .numpy() call assumes that the tensor returned by get_condition_ppl is a single-element tensor. Please verify that this change does not lead to runtime errors when processing bf16 values.
    Flagged code: .numpy()
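The reviewer's concern can be addressed directly: if the value is a single-element tensor, .item() extracts a Python scalar without any NumPy round-trip and works for bf16 as well as fp32. A small sketch, using a hypothetical stand-in for the value get_condition_ppl might return:

```python
import torch

# hypothetical stand-in for a single-element bf16 tensor,
# as the reviewer assumes get_condition_ppl returns
ppl = torch.tensor([4.0], dtype=torch.bfloat16)

# .item() converts the single element to a Python float,
# avoiding the bf16-to-NumPy conversion that would raise a TypeError
value = ppl.item()
```

If the tensor could instead hold multiple elements, .item() would raise a RuntimeError, which is exactly the runtime-error scenario the reviewer asks to verify.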

@iofu728 iofu728 merged commit e4e172a into main Mar 11, 2025
1 of 9 checks passed
@iofu728 iofu728 deleted the hjiang/support_bf16 branch March 11, 2025 10:01
