# AskCode API Reference

## askcode.main

This module contains the definition of the AskCode main class.

### AskCode
    AskCode(
        codebase_path,
        language,
        parser_threshold,
        text_splitter_chunk_size,
        text_splitter_chunk_overlap,
        use_HF,
        llm_model,
        embeddings_model,
        retriever_search_type,
        retriever_k,
        max_new_tokens,
        temperature,
        top_p,
        repetition_penalty,
        use_autogptq,
    )
Source code in askcode/main.py
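The signature above lists the constructor parameters without defaults; the usual defaults are the ones the CLI `main` (documented below) passes through. As a reference sketch, they can be collected in a plain dict and handed to the class with `AskCode(**askcode_kwargs)` — the dict itself is illustrative, not part of the API:

```python
# Keyword arguments accepted by AskCode, paired with the defaults that the
# CLI's main() (documented below) uses. In real use: AskCode(**askcode_kwargs).
askcode_kwargs = dict(
    codebase_path=".",
    language="python",            # 'python' or 'javascript'
    parser_threshold=0,
    text_splitter_chunk_size=256,
    text_splitter_chunk_overlap=50,
    use_HF=True,                  # False -> OpenAI models instead
    llm_model="TheBloke/CodeLlama-7B-GPTQ",
    embeddings_model="sentence-transformers/all-MiniLM-L12-v2",
    retriever_search_type="mmr",  # "similarity", "mmr", or "similarity_score_threshold"
    retriever_k=4,
    max_new_tokens=50,
    temperature=0.1,
    top_p=0.9,
    repetition_penalty=1.0,
    use_autogptq=True,
)
```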
#### setup

    setup()

Sets up the necessary components for the langchain chain.

Returns:

| Type | Description |
|---|---|
| `None` | None |
Source code in askcode/main.py
#### load_retriever

    load_retriever()

Loads the files from the codebase and sets up the retriever.
Source code in askcode/main.py
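`load_retriever` configures a text splitter using `text_splitter_chunk_size` and `text_splitter_chunk_overlap` before indexing the documents. A simplified, character-level sketch of that chunking (`split_text` is a hypothetical helper; the real splitter is langchain's language-aware one, so this is only illustrative):

```python
def split_text(text, chunk_size=256, chunk_overlap=50):
    """Split text into chunks of at most chunk_size characters, with
    chunk_overlap characters shared between consecutive chunks.
    A character-level stand-in for the language-aware splitter."""
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - chunk_overlap, 1), step)]

text = "".join(chr(97 + i % 26) for i in range(600))
chunks = split_text(text)  # 600 chars -> chunks of 256, 256, and 188 characters
```

The overlap means each chunk repeats the tail of the previous one, so a retrieved chunk keeps some surrounding context.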
#### load_llm

    load_llm()

Sets up the LLM.
Source code in askcode/main.py
#### get_prompt_template

    get_prompt_template()

Sets up the prompt template.
Source code in askcode/main.py
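A retrieval-augmented prompt template of the kind `get_prompt_template` builds leaves placeholders for the retrieved context and the question. The exact wording and placeholder names of the real template are not shown in this reference, so the following is only an assumption-laden sketch:

```python
# Hypothetical template; the real template's wording and its placeholder
# names ({context}, {question}) are assumptions, not the actual API.
PROMPT_TEMPLATE = (
    "Use the following code context to answer the question.\n\n"
    "Context:\n{context}\n\n"
    "Question: {question}\n"
    "Answer:"
)

prompt = PROMPT_TEMPLATE.format(
    context="def add(a, b): return a + b",
    question="What does add do?",
)
```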
#### chain

    chain(retriever, llm, prompt_template, question)

Runs a question through the langchain chain.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `retriever` | | the docs retriever | required |
| `llm` | | the large language model | required |
| `prompt_template` | | the prompt template | required |
| `question` | `str` | the question | required |

Returns:

| Type | Description |
|---|---|
| | chain results |
Source code in askcode/main.py
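Structurally, `chain` retrieves documents for the question, fills the prompt template with them, and hands the resulting prompt to the LLM. A minimal sketch of that composition with plain callables standing in for the real langchain components (the stubs and their outputs are hypothetical):

```python
def chain(retriever, llm, prompt_template, question):
    """Sketch of the chain step: retrieve -> format prompt -> generate."""
    docs = retriever(question)  # the docs retriever returns relevant snippets
    prompt = prompt_template.format(context="\n".join(docs), question=question)
    return llm(prompt)          # chain results

# Stub components standing in for the real retriever and LLM.
retriever = lambda q: ["def add(a, b): return a + b"]
llm = lambda p: "add returns the sum of a and b"
template = "Context:\n{context}\nQuestion: {question}"

result = chain(retriever, llm, template, "What does add do?")
```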
#### ask

    ask(question)

Asks a question to the codebase. You need to call `self.setup` before calling this function.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `question` | `str` | the question | required |

Returns:

| Type | Description |
|---|---|
| | chain results |
Source code in askcode/main.py
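The call order matters: `setup()` must run before `ask()`. A minimal stub illustrating that contract (`AskCodeSketch` and its guard are hypothetical; in the real class, `setup` wires the retriever, LLM, and prompt template):

```python
class AskCodeSketch:
    """Stub mirroring the setup-before-ask contract of AskCode."""

    def __init__(self):
        self._ready = False

    def setup(self):
        # Real class: load_retriever(), load_llm(), get_prompt_template().
        self._ready = True

    def ask(self, question):
        if not self._ready:
            raise RuntimeError("call setup() before ask()")
        return f"answer to: {question}"

ac = AskCodeSketch()
ac.setup()
answer = ac.ask("Where is the CLI entry point?")
```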
## askcode.cli

Command line interface.

### main

    main(
        codebase_path=".",
        language="python",
        parser_threshold=0,
        text_splitter_chunk_size=256,
        text_splitter_chunk_overlap=50,
        use_HF=True,
        llm_model="TheBloke/CodeLlama-7B-GPTQ",
        embeddings_model="sentence-transformers/all-MiniLM-L12-v2",
        retriever_search_type="mmr",
        retriever_k=4,
        max_new_tokens=50,
        temperature=0.1,
        top_p=0.9,
        repetition_penalty=1.0,
        use_autogptq=True,
    )
Chat with your codebase with the power of LLMs.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `codebase_path` | `str` | path to your codebase | `'.'` |
| `language` | `str` | programming language (`'python'` or `'javascript'` at the moment) | `'python'` |
| `parser_threshold` | `int` | minimum lines needed to activate parsing | `0` |
| `text_splitter_chunk_size` | `int` | maximum size of chunks to return | `256` |
| `text_splitter_chunk_overlap` | `int` | overlap in characters between chunks | `50` |
| `use_HF` | `bool` | use Hugging Face models; if False, OpenAI models will be used | `True` |
| `llm_model` | `str` | large language model name (HF model name or OpenAI model) | `'TheBloke/CodeLlama-7B-GPTQ'` |
| `embeddings_model` | `str` | embeddings model (HF model name or OpenAI model) | `'sentence-transformers/all-MiniLM-L12-v2'` |
| `retriever_search_type` | `str` | type of search the retriever should perform: `"similarity"`, `"mmr"`, or `"similarity_score_threshold"` | `'mmr'` |
| `retriever_k` | `int` | amount of documents to return | `4` |
| `max_new_tokens` | `int` | maximum tokens to generate | `50` |
| `temperature` | `float` | sampling temperature | `0.1` |
| `top_p` | `float` | sampling top_p | `0.9` |
| `repetition_penalty` | `float` | sampling repetition penalty | `1.0` |
| `use_autogptq` | `bool` | set to True to use quantized AutoGPTQ models | `True` |

Returns:

| Type | Description |
|---|---|
| `None` | |
Source code in askcode/cli.py