
Conversation

@logan-markewich (Collaborator)

Revamps the LlamaCloudIndex significantly

  1. New LlamaCloudIndex.create_index() static function to create an empty index
  2. Improved public API for wait_for_completion
  3. Fixed several outdated API calls from the llama-cloud SDK
  4. Added a ton of async methods
  5. Pinned the llama-cloud SDK to avoid errors in the future
  6. Removed the integration test marks, since these were both a) broken and b) not being run
  7. Added more tests
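The improved `wait_for_completion` flow (item 2) is essentially a status-polling loop. Here is a minimal, self-contained sketch of that pattern; the function name matches the PR, but the `get_status` callable, the status strings, and the `poll_interval`/`timeout` parameters are stand-ins, not the actual llama-cloud SDK API:

```python
import time
from typing import Callable


def wait_for_completion(
    get_status: Callable[[], str],
    poll_interval: float = 0.01,
    timeout: float = 5.0,
) -> str:
    """Poll a status callable until ingestion reaches a terminal state."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("SUCCESS", "ERROR"):
            return status
        time.sleep(poll_interval)
    raise TimeoutError("pipeline did not reach a terminal state before the timeout")


# Simulate a pipeline that finishes on the third poll.
statuses = iter(["PENDING", "IN_PROGRESS", "SUCCESS"])
print(wait_for_completion(lambda: next(statuses)))  # SUCCESS
```

An async variant would look the same with `asyncio.sleep` in place of `time.sleep`, which is presumably what the new async methods (item 4) provide.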

@dosubot dosubot bot added the size:XL This PR changes 500-999 lines, ignoring generated files. label May 27, 2025
@logan-markewich logan-markewich merged commit 227a3da into main May 27, 2025
9 of 10 checks passed
@logan-markewich logan-markewich deleted the logan/llama_cloud_index_updates branch May 27, 2025 19:57
```python
)
status = status_response.status
except httpx.HTTPStatusError as e:
if e.response.status_code in (429, 500, 502, 503, 504):
```
Contributor
Feels like it shouldn't keep hammering our service if there's a 5xx? That could exacerbate any instability the API may be having.
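One standard way to address this concern is exponential backoff with a retry cap, so transient 429/5xx responses are retried a few times with growing delays instead of in a tight loop. A minimal sketch, independent of the llama-cloud SDK (the `HTTPStatusError` stand-in, `fetch` callable, and delay parameters are illustrative, not the code under review):

```python
import random
import time
from typing import Callable

RETRYABLE = {429, 500, 502, 503, 504}


class HTTPStatusError(Exception):
    """Stand-in for httpx.HTTPStatusError, carrying only a status code."""

    def __init__(self, status_code: int):
        self.status_code = status_code


def fetch_with_backoff(
    fetch: Callable[[], str],
    max_retries: int = 5,
    base_delay: float = 0.01,
) -> str:
    """Retry retryable HTTP errors with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return fetch()
        except HTTPStatusError as e:
            if e.status_code not in RETRYABLE or attempt == max_retries - 1:
                raise
            # Wait 2**attempt * base_delay (with jitter) before retrying,
            # so an unstable API gets breathing room instead of a hammering.
            time.sleep(base_delay * (2**attempt) * (1 + random.random()))
    raise RuntimeError("unreachable")
```

Non-retryable status codes and exhausted retries re-raise immediately, so callers still see the underlying error.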

```python
self,
file_path: str,
verbose: bool = False,
wait_for_ingestion: bool = True,
```
Contributor

Should we rename this to wait_for_completion, just for consistency with the new method name?

@colca mentioned this pull request on Jun 9, 2025
