SLiCAPlex.py¶
SLiCAP tokenizer for netlist files.
Imported by the module SLiCAPmath.py.
-
find_column(token)¶ Computes and returns the column number of ‘token’.
Parameters: token (ply.lex.token) – Token whose column number is to be calculated.
Returns: Column position of this token.
Return type: int
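The computation follows the usual ply.lex recipe sketched below. The real function takes only the token and obtains the lexer input internally, so the explicit lexdata argument in this sketch is an assumption made for illustration.

def find_column(lexdata, token):
    # Column = offset of the token from the last newline before it (1-based).
    # 'lexdata' is the full text that was fed to the lexer.
    line_start = lexdata.rfind('\n', 0, token.lexpos) + 1
    return token.lexpos - line_start + 1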
-
printError(msg, line, pos)¶ Prints the line with the error and an error message, and shows the position of the error.
Parameters: - msg (str) – Error message.
- line (str) – Input line with the error.
- pos (int) – Position of the error in the input line.
Returns: out: Input line with the error message.
Return type: str
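A minimal sketch of the behaviour described above (hypothetical body, not the SLiCAP source): echo the offending line and mark the error position with a caret.

def print_error(msg, line, pos):
    # Reproduce the input line, then point at column 'pos' and append the message.
    out = line + '\n' + ' ' * pos + '^ ' + msg
    print(out)
    return out

print_error('Unexpected token.', 'V1 0 1 1k??', 9)
# V1 0 1 1k??
#          ^ Unexpected token.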
-
replaceScaleFactors(txt)¶ Replaces scale factors in expressions with their value in scientific notation.
Parameters: txt (str) – Expression or number with scale factors.
Returns: out: Text in which scale factors are replaced with their corresponding scientific notation.
Return type: str
Example:
>>> replaceScaleFactors('sin(2*pi*1M)')
'sin(2*pi*1E6)'
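The substitution can be sketched as a regular-expression replacement over the SPICE scale-factor letters. The code below is a hypothetical re-implementation for illustration, not the SLiCAP source.

import re

# Map each scale-factor letter to its power of ten (standard SI prefixes).
SCALEFACTORS = {'y': '-24', 'z': '-21', 'a': '-18', 'f': '-15', 'p': '-12',
                'n': '-9', 'u': '-6', 'm': '-3', 'k': '3', 'M': '6',
                'G': '9', 'T': '12', 'P': '15', 'E': '18', 'Z': '21', 'Y': '24'}

def replace_scale_factors(txt):
    # Match a number directly followed by a scale-factor letter and rewrite
    # it as <number>E<exponent>.
    pattern = re.compile(r'([+-]?\d+\.?\d*)([yzafpnumkMGTPEZY])')
    return pattern.sub(lambda m: m.group(1) + 'E' + SCALEFACTORS[m.group(2)], txt)

print(replace_scale_factors('sin(2*pi*1M)'))   # sin(2*pi*1E6)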
-
t_CMD(t)¶ \.[a-zA-Z]+
-
t_COMMENT(t)¶ \*.*|(;.*)
-
t_EXPR(t)¶ {[\w()/+\-^ .]*}
-
t_LEFTBR(t)¶ \(
-
t_PARAMS(t)¶ params:/i
-
t_PARDEF(t)¶ [a-zA-Z]\w*\s*=\s*({[\w()/+\-^ .]*} |([+-]?\d+\.?\d*[eE][+-]?\d+) |([+-]?\d+\.?\d*[yzafpnumkMGTPEZY]) |([+-]?\d+\.\d*) |([+-]?\d+))
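For reference, the rule above matches parameter definitions such as the ones tested below with Python's re module. The pattern is a reconstruction of the docstring (regex escapes restored); the '*' inside the braced group is an assumption, and re.VERBOSE mirrors the default flag that ply.lex applies to rule docstrings.

import re

PARDEF = re.compile(r"""[a-zA-Z]\w*\s*=\s*({[\w()/+\-^ .]*}
                        |([+-]?\d+\.?\d*[eE][+-]?\d+)
                        |([+-]?\d+\.?\d*[yzafpnumkMGTPEZY])
                        |([+-]?\d+\.\d*)
                        |([+-]?\d+))""", re.VERBOSE)

for text in ['R=1k', 'C=2.2E-12', 'tau={1/(2 pi f)}', 'value=100']:
    print(text, bool(PARDEF.match(text)))   # each prints True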
-
t_RETURN(t)¶ \r
-
t_SCALE(t)¶ [+-]?\d+\.?\d*[yzafpnumkMGTPEZY]
-
t_SCI(t)¶ [+-]?\d+\.?\d*[eE][+-]?\d+
-
t_error(t)¶
-
t_newline(t)¶ \n+
-
t_RIGHTBR(t)¶ \)
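The t_… rules above follow the ply.lex convention: each function carries its regular expression as a docstring, receives the matched token, and either returns it or discards it. A minimal, self-contained sketch of that convention (not the SLiCAP lexer itself):

import ply.lex as lex

tokens = ('SCI',)

def t_SCI(t):
    r'[+-]?\d+\.?\d*[eE][+-]?\d+'
    # The matched text is available as t.value; returning t emits the token.
    return t

def t_newline(t):
    r'\n+'
    # Count line breaks so error messages can report the correct line.
    t.lexer.lineno += len(t.value)

t_ignore = ' \t'

def t_error(t):
    # Report the offending character and continue with the next one.
    print("Illegal character '%s'" % t.value[0])
    t.lexer.skip(1)

lexer = lex.lex()
lexer.input('1E3\n2.5e-6')
for tok in lexer:
    print(tok.type, tok.value, tok.lineno)   # SCI 1E3 1, then SCI 2.5e-6 2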
-
tokenize(cirFileName)¶ Resets the lexer and creates the tokens from the file ‘cirFileName’.
Parameters: cirFileName (str) – Name of the netlist file to be tokenized.
Returns: lexer
Return type: ply.lex.lex
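A typical usage sketch, using the standard ply.lex token interface on the returned lexer; the import path and the netlist file name are assumptions.

from SLiCAPlex import tokenize   # import path is an assumption

lexer = tokenize('myFirstRCnetwork.cir')   # hypothetical netlist file
while True:
    tok = lexer.token()          # ply.lex: returns None when input is exhausted
    if not tok:
        break
    print(tok.type, tok.value, tok.lineno)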
-
tokenizeTxt(textString)¶ Resets the lexer and creates the tokens from the text input ‘textString’.
Parameters: textString (str) – Text input to be tokenized.
Returns: lexer
Return type: ply.lex.lex
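As with tokenize(), the returned lexer is consumed through the standard ply.lex interface; the import path and the example netlist line below are assumptions.

from SLiCAPlex import tokenizeTxt   # import path is an assumption

lexer = tokenizeTxt('R1 out 0 {1/(2 pi f C)}')   # hypothetical netlist line
tok = lexer.token()
while tok:
    print(tok.type, repr(tok.value))
    tok = lexer.token()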