llama_cpp_conv

View on PyPI · Reverse Dependencies (2)

0.2.58

llama_cpp_conv-0.2.58-cp39-cp39-win_amd64.whl
llama_cpp_conv-0.2.58-cp39-cp39-win32.whl
llama_cpp_conv-0.2.58-cp39-cp39-musllinux_1_1_x86_64.whl
llama_cpp_conv-0.2.58-cp39-cp39-musllinux_1_1_i686.whl
llama_cpp_conv-0.2.58-cp39-cp39-manylinux_2_17_x86_64.whl
llama_cpp_conv-0.2.58-cp39-cp39-manylinux_2_17_i686.whl
llama_cpp_conv-0.2.58-cp38-cp38-win_amd64.whl
llama_cpp_conv-0.2.58-cp38-cp38-win32.whl
llama_cpp_conv-0.2.58-cp38-cp38-musllinux_1_1_x86_64.whl
llama_cpp_conv-0.2.58-cp38-cp38-musllinux_1_1_i686.whl
llama_cpp_conv-0.2.58-cp38-cp38-manylinux_2_17_x86_64.whl
llama_cpp_conv-0.2.58-cp38-cp38-manylinux_2_17_i686.whl
llama_cpp_conv-0.2.58-cp311-cp311-win_amd64.whl
llama_cpp_conv-0.2.58-cp311-cp311-win32.whl
llama_cpp_conv-0.2.58-cp311-cp311-musllinux_1_1_x86_64.whl
llama_cpp_conv-0.2.58-cp311-cp311-musllinux_1_1_i686.whl
llama_cpp_conv-0.2.58-cp311-cp311-manylinux_2_17_x86_64.whl
llama_cpp_conv-0.2.58-cp311-cp311-manylinux_2_17_i686.whl
llama_cpp_conv-0.2.58-cp310-cp310-win_amd64.whl
llama_cpp_conv-0.2.58-cp310-cp310-win32.whl
llama_cpp_conv-0.2.58-cp310-cp310-musllinux_1_1_x86_64.whl
llama_cpp_conv-0.2.58-cp310-cp310-musllinux_1_1_i686.whl
llama_cpp_conv-0.2.58-cp310-cp310-manylinux_2_17_x86_64.whl
llama_cpp_conv-0.2.58-cp310-cp310-manylinux_2_17_i686.whl
llama_cpp_conv-0.2.58-pp39-pypy39_pp73-win_amd64.whl
llama_cpp_conv-0.2.58-pp39-pypy39_pp73-manylinux_2_17_x86_64.whl
llama_cpp_conv-0.2.58-pp39-pypy39_pp73-manylinux_2_17_i686.whl
llama_cpp_conv-0.2.58-pp38-pypy38_pp73-win_amd64.whl
llama_cpp_conv-0.2.58-pp38-pypy38_pp73-manylinux_2_17_x86_64.whl
llama_cpp_conv-0.2.58-pp38-pypy38_pp73-manylinux_2_17_i686.whl
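Each filename above follows the PEP 427 wheel naming convention: `name-version-pythontag-abitag-platformtag.whl`. A minimal sketch of pulling those tags apart with the standard library, assuming a well-formed filename with no optional build tag (true of every file listed here):

```python
# Split a PEP 427 wheel filename into its name, version, and compatibility tags.
# Underscores inside the distribution name are safe: fields are hyphen-separated.

def parse_wheel_filename(filename: str) -> dict:
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python_tag": python_tag,     # e.g. cp39 = CPython 3.9, pp39 = PyPy 3.9
        "abi_tag": abi_tag,
        "platform_tag": platform_tag, # e.g. win_amd64, manylinux_2_17_x86_64
    }

tags = parse_wheel_filename("llama_cpp_conv-0.2.58-cp39-cp39-win_amd64.whl")
print(tags["python_tag"], tags["platform_tag"])  # cp39 win_amd64
```

Pip uses exactly these tags to pick a compatible file from the list for the running interpreter and platform.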

Wheel Details

Project: llama_cpp_conv
Version: 0.2.58
Filename: llama_cpp_conv-0.2.58-cp39-cp39-win_amd64.whl
Download: [link]
Size: 3299592 bytes (≈3.1 MiB)
MD5: b9c7e3d6d798d651fb4bc6edec78f9c7
SHA256: a2ac74e6bb62c0b0d334dee4229a341d5269e4f53701ccb82e64cfda27af3ecc
Uploaded: 2024-04-01 06:13:05 +0000
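The SHA256 value above can be checked against a downloaded copy of the wheel; a sketch using `hashlib` (the local filename is an assumption about where you saved the file):

```python
import hashlib

def sha256_hex(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "a2ac74e6bb62c0b0d334dee4229a341d5269e4f53701ccb82e64cfda27af3ecc"
# assert sha256_hex("llama_cpp_conv-0.2.58-cp39-cp39-win_amd64.whl") == expected
```

Streaming in chunks keeps memory flat regardless of file size; the same pattern works for the MD5 field by swapping in `hashlib.md5()`.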

dist-info

METADATA

Metadata-Version: 2.1
Name: llama-cpp-conv
Version: 0.2.58
Summary: llama.cpp (GGUF) conversion for pypi
Author-Email: xhedit <jevd[at]protonmail.com>
Project-Url: Homepage, https://github.com/xhedit/llama-cpp-conv
Project-Url: Issues, https://github.com/xhedit/llama-cpp-conv/issues
Project-Url: Documentation, https://llama-cpp-python.readthedocs.io/en/latest/
Project-Url: Changelog, https://llama-cpp-python.readthedocs.io/en/latest/changelog/
License: MIT
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.8
Requires-Dist: typing-extensions (>=4.5.0)
Requires-Dist: numpy (>=1.20.0)
Requires-Dist: diskcache (>=5.6.1)
Requires-Dist: jinja2 (>=2.11.3)
Requires-Dist: gguf (>=0.6.0)
Requires-Dist: uvicorn (>=0.22.0); extra == "server"
Requires-Dist: fastapi (>=0.100.0); extra == "server"
Requires-Dist: pydantic-settings (>=2.0.1); extra == "server"
Requires-Dist: sse-starlette (>=1.6.1); extra == "server"
Requires-Dist: starlette-context (<0.4,>=0.3.6); extra == "server"
Requires-Dist: pytest (>=7.4.0); extra == "test"
Requires-Dist: httpx (>=0.24.1); extra == "test"
Requires-Dist: scipy (>=1.10); extra == "test"
Requires-Dist: black (>=23.3.0); extra == "dev"
Requires-Dist: twine (>=4.0.2); extra == "dev"
Requires-Dist: mkdocs (>=1.4.3); extra == "dev"
Requires-Dist: mkdocstrings[python] (>=0.22.0); extra == "dev"
Requires-Dist: mkdocs-material (>=9.1.18); extra == "dev"
Requires-Dist: pytest (>=7.4.0); extra == "dev"
Requires-Dist: httpx (>=0.24.1); extra == "dev"
Requires-Dist: llama-cpp-conv[dev,server,test]; extra == "all"
Provides-Extra: server
Provides-Extra: test
Provides-Extra: dev
Provides-Extra: all
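The extras declared above map to optional dependency groups; they are selected with the usual pip bracket syntax (package name as published on PyPI):

```shell
# base package only
pip install llama_cpp_conv

# add the FastAPI/uvicorn dependencies declared under extra == "server"
pip install "llama_cpp_conv[server]"

# the "all" extra pulls in dev, server, and test together
pip install "llama_cpp_conv[all]"
```

Quoting the bracketed form avoids shell glob expansion in zsh and similar shells.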
Description-Content-Type: text/markdown
License-File: LICENSE.md
[Description omitted; length: 113 characters]

WHEEL

Wheel-Version: 1.0
Generator: scikit-build-core 0.8.2
Root-Is-Purelib: false
Tag: cp39-cp39-win_amd64

RECORD

Path Digest Size
bin/convert-lora-to-ggml.py sha256=4MYT302v4umcsihl4cQDaIiLgeX9dyftW0VhcGJHxwk 5491
bin/convert.py sha256=BbaFTlI77iyX94ZR0rsmMn2n68IChYm9FGA1MFFGIao 62515
bin/imatrix.exe sha256=UPPjnKhp-OcpvqI3o_tCqRf20j6iAj2_Gh1AShzZej8 1174528
bin/quantize.exe sha256=sODgpfTj05GK2eimEGwXL62HKWvglSfdWjYhZaFPD3M 775680
include/ggml-alloc.h sha256=o699eP73ebFZqo5N_Wn7ygIAvMVo8FxuM7KGwJhdeN4 3071
include/ggml-backend.h sha256=EtKgkufT9cbss0OQ_h2Db39m2bxFiCEpKVlCket4pCM 13578
include/ggml.h sha256=JRHHvwHlopfJdOCz5TmlC3OQCQN3k_ECNuDXYVhxgcg 90668
include/llama.h sha256=Zu7B7csZP5VLWftiN8YMLirGTVsimmJ7xcfCVeeIa0g 48256
lib/cmake/Llama/LlamaConfig.cmake sha256=4A8uJbi285ShoZAXQOR5djGPdh_dpyaoHnEdwPEOJhI 2836
lib/cmake/Llama/LlamaConfigVersion.cmake sha256=jjwF150EMFhlH_ygLSpa0A15L5mJli5tLK1Dor6fSz0 2827
lib/llama.lib sha256=jkfMZBvpFQy645PM4xDWl1ztlq-gPmB3O2J6Udr2IYc 4965976
llama_cpp/__init__.py sha256=MLudSzhAnI6aoGKIoAZB1MZEPWhknzT2_4_RGiIdcFo 74
llama_cpp/_internals.py sha256=RgOUF0xucUx_CsC-komC5m8JRFhluVsxJMfn_TOxY3s 27480
llama_cpp/_logger.py sha256=7PAupB4SKhD45A9uterGQG5U2fM--D_twkbjWZp3wWo 850
llama_cpp/_utils.py sha256=ses-hzQ__rNj4dkjU0qd41x7glwbkeGmV9a6MQ2yMWw 2673
llama_cpp/llama.lib sha256=jkfMZBvpFQy645PM4xDWl1ztlq-gPmB3O2J6Udr2IYc 4965976
llama_cpp/llama.py sha256=Fzg9_PDjsoQ7okJ8D1uzjWa0EQz7GaQzGad4d4oed8M 83355
llama_cpp/llama_cache.py sha256=seLXjhwfaLR2-xO9f5hRT0yDRwm5MsBaEvgWGy31rIE 5112
llama_cpp/llama_chat_format.py sha256=GOongio9KMih_gPclRe-PkvJXcsPbAwt7EogLaB8NaY 109338
llama_cpp/llama_cpp.py sha256=8yDnD8Q9sUt5PEwovFKLqOKeucTAUh-ligNWnZpO67w 107992
llama_cpp/llama_grammar.py sha256=ih4ApB3siQPiyb43aI-L0A6m4_WcbNGce6Sm1cAeAks 56701
llama_cpp/llama_speculative.py sha256=LGCdrQLHYyEfwyDRrnKQyDYYAAiOgJ8MHTt8pZKFlZ4 2152
llama_cpp/llama_tokenizer.py sha256=4MhXs3dl7MqNJHMq-rmWHIj6i1n-sMIUxx_LNAPlYtA 3527
llama_cpp/llama_types.py sha256=owERcxBZW8Dx1TttGA8zms51D9n8g-lE2LdFxNcN-pM 8440
llama_cpp/llava_cpp.py sha256=9w6KP4Bc2HkwlhfhakP-QkGlrClX_FKrl8MNGOBPwA0 6817
llama_cpp/py.typed sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU 0
llama_cpp/server/__init__.py sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU 0
llama_cpp/server/__main__.py sha256=1PqRciz8w8schBc9vE8mkV42OSbZ37nFrzeAm8WFow8 2521
llama_cpp/server/app.py sha256=-xipkvVjFHiaCk4ms44vk0yKpF_6iiLLQrecGCFbDh0 18089
llama_cpp/server/cli.py sha256=YYwdhE2wwmMPP4izq9jokvsTfzJR6tpQ2eL895z_C9s 3365
llama_cpp/server/errors.py sha256=uR0MQ6I79_gm23_xE4r06FRzWeE8MrwTDQtaxZCF8Q0 7317
llama_cpp/server/model.py sha256=epcr0hFbAhJZop5aUHDgT-27g2mLRcjo2CmzNoYYdWE 7837
llama_cpp/server/settings.py sha256=9FhpZPYXtIpfMbi37o9JYiuXLuUxcPtW1vXPyNzyGeA 7166
llama_cpp/server/types.py sha256=PLpCzoLNhCBbSq_qdwLqmdGn2oRaQtRWnky9jCMFT7E 12276
llama_cpp_conv-0.2.58.dist-info/METADATA sha256=aSYrVdha8jXjc1F9EZETfIUfRzL8Rp5KirsSUh9-3L4 2020
llama_cpp_conv-0.2.58.dist-info/WHEEL sha256=5LdD53aNvB3JLNYGt7PLG057a2l0YNjBXwsz2n0i3lo 103
llama_cpp_conv-0.2.58.dist-info/entry_points.txt sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU 0
llama_cpp_conv-0.2.58.dist-info/licenses/LICENSE.md sha256=a7Uo827go9KeNgsGKT62ytKmUDnr_VPfzHxn42Fvy9c 1077
llama_cpp_conv-0.2.58.dist-info/RECORD
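The Digest column above uses the standard RECORD encoding: urlsafe base64 of the raw SHA-256 digest, with the trailing `=` padding stripped. A quick way to reproduce it, illustrated with the zero-byte entries (`py.typed`, `server/__init__.py`, `entry_points.txt`), which all carry the digest of the empty string:

```python
import base64
import hashlib

def record_digest(data: bytes) -> str:
    """RECORD-style digest: urlsafe-base64(sha256(data)) without '=' padding."""
    raw = hashlib.sha256(data).digest()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

# Digest of zero bytes, matching the three empty files in the RECORD above:
print(record_digest(b""))  # 47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```

Installers recompute this per file at install time to detect corruption; the RECORD file itself is listed without a digest, since it cannot hash itself.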

entry_points.txt

[empty]