🔗 Auto-register apps: Universal index discovery (#64)
* feat: Enhance CLI with improved list and smart remove commands

## ✨ New Features

### 🏠 Enhanced `leann list` command

- **Better UX**: Current project shown first with clear separation
- **Visual improvements**: Icons (🏠/📂), better formatting, size info
- **Smart guidance**: Context-aware usage examples and getting-started tips

### 🛡️ Smart `leann remove` command

- **Safety first**: Always shows ALL matching indexes across projects
- **Intelligent handling**:
  - Single match: Clear location display with cross-project warnings
  - Multiple matches: Interactive selection with final confirmation (sketched below)
- **Prevents accidents**: No more deleting the wrong index due to name conflicts
- **User-friendly**: 'c' to cancel, clear visual hierarchy, detailed info

### 🔧 Technical improvements

- **Clean logging**: Hide debug messages for a better CLI experience
- **Comprehensive search**: Always scan all projects for transparency
- **Error handling**: Graceful handling of edge cases and user input

## 🎯 Impact

- **Safer**: Eliminates the risk of accidental index deletion
- **Clearer**: Users always know what they're operating on
- **Smarter**: Automatic detection and handling of common scenarios

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

* chore: vscode ruff, and format

---------

Co-authored-by: Claude <noreply@anthropic.com>
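The interactive, safety-first flow described for `leann remove` might look roughly like this sketch. Everything here is hypothetical: the function name, prompt strings, and return convention are illustrative, not the actual leann implementation.

```python
# Hypothetical sketch of the multi-match selection flow described above;
# names and prompt strings are illustrative, not the actual leann code.
def choose_index_to_remove(name: str, matches: list[str]) -> str | None:
    print(f"Found {len(matches)} indexes named {name!r}:")
    for i, path in enumerate(matches, 1):
        print(f"  {i}. {path}")
    choice = input("Select a number to remove, or 'c' to cancel: ").strip().lower()
    if choice == "c" or not choice.isdigit():
        return None  # 'c' or any non-numeric input cancels
    idx = int(choice)
    if not 1 <= idx <= len(matches):
        return None  # out-of-range selections are treated as a cancel
    return matches[idx - 1]  # caller still asks for final confirmation
```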
```diff
@@ -707,20 +707,28 @@ class GeminiChat(LLMInterface):
         logger.info(f"Sending request to Gemini with model {self.model}")

         try:
             # Set generation configuration
-            generation_config = {
-                "temperature": kwargs.get("temperature", 0.7),
-                "max_output_tokens": kwargs.get("max_tokens", 1000),
-            }
+            from google.genai.types import GenerateContentConfig
+
+            generation_config = GenerateContentConfig(
+                temperature=kwargs.get("temperature", 0.7),
+                max_output_tokens=kwargs.get("max_tokens", 1000),
+            )

             # Handle top_p parameter
             if "top_p" in kwargs:
-                generation_config["top_p"] = kwargs["top_p"]
+                generation_config.top_p = kwargs["top_p"]

             response = self.client.models.generate_content(
-                model=self.model, contents=prompt, config=generation_config
+                model=self.model,
+                contents=prompt,
+                config=generation_config,
             )
-            return response.text.strip()
+            # Handle potential None response text
+            response_text = response.text
+            if response_text is None:
+                logger.warning("Gemini returned None response text")
+                return ""
+            return response_text.strip()
         except Exception as e:
             logger.error(f"Error communicating with Gemini: {e}")
             return f"Error: Could not get a response from Gemini. Details: {e}"
```
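For reference, here is a minimal, self-contained sketch of the new call pattern, assuming the `google-genai` SDK. The model name and prompt are illustrative, and the client is assumed to read its API key from the environment.

```python
from google import genai
from google.genai.types import GenerateContentConfig

client = genai.Client()  # picks up GOOGLE_API_KEY from the environment

config = GenerateContentConfig(temperature=0.7, max_output_tokens=1000)
config.top_p = 0.9  # optional, mirroring the kwargs handling in the diff

response = client.models.generate_content(
    model="gemini-2.5-flash",  # illustrative model name
    contents="Say hello in one word.",
    config=config,
)

# response.text can be None (for example, when the candidate is blocked
# or empty), which is exactly what the new guard in the diff protects against.
text = response.text
print("" if text is None else text.strip())
```

Passing `top_p` directly to the `GenerateContentConfig` constructor would work just as well; attribute assignment is shown only to mirror the diff.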