When to name a variable with $



There are a number of symbols in the Wolfram Language whose names start with $, including $Aborted, $MinPrecision, etc.

What makes these $Vars special, and when is it appropriate to introduce your own $Var (e.g. as a private variable in a package)?


Posted 2018-07-02T00:51:27.860

Reputation: 5 062

I make most of my constants like this when doing dev work, even if they’re not going to be exposed. It’s a nice syntactic cue and has served me well. – b3m2a1 – 2018-07-02T08:52:36.163

$Vars are not special in any way. It is merely a convention that values you might call "constants" (in a programming sense) are named this way. "Constant" here doesn't mean that they can't be set. They often can, e.g. $Assumptions. It means that it will influence the rest of your session. – Szabolcs – 2018-07-02T09:55:16.850
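The session-wide influence Szabolcs mentions can be seen with $Assumptions, for example:

```mathematica
(* Setting $Assumptions influences later simplification in the session *)
$Assumptions = x > 0;
Simplify[Sqrt[x^2]]   (* x, rather than the unevaluated Sqrt[x^2] *)
```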



Look at the system context to see what the convention is. I can find 257 such built-in symbols in version 11.3,

symbols = Names["System`$*"];
Length @ symbols
(* 257 *)

In the system context there seem to be two main uses of the $ naming convention. One is for static symbols without values that indicate e.g. an evaluation status, like $Aborted, $Failed, $Canceled, etc. The other is for "constants" that may have values defined at runtime, like $BaseDirectory or $FrontEnd (though not necessarily at startup; $Pre and $Post, for example, have no values by default). Since their values are often settable, even by the user, they are not true constants.
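A minimal sketch of both uses (the file name and directory below are hypothetical):

```mathematica
(* Use 1: an inert status symbol returned to signal failure *)
result = Quiet @ Import["no-such-file.xyz"];
result === $Failed   (* True *)

(* Use 2: a settable "constant" consulted by other functions *)
AppendTo[$Path, "/my/extra/packages"];   (* hypothetical directory *)
```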

You don't know what fonts are available on the user's system, but you can write your code to reference $FontFamilies. Perhaps $MinPrecision is system dependent, so using that variable means you don't have to hardcode a value in your functions.
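For instance, a sketch of using $FontFamilies so the choice of font is not hardcoded (the font names here are just examples):

```mathematica
(* Fall back gracefully if the preferred font is absent *)
codeFont = If[MemberQ[$FontFamilies, "Source Code Pro"],
   "Source Code Pro",
   "Courier"];
Style["x = Range[10]", FontFamily -> codeFont]
```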

You certainly don't know $SystemID or $OperatingSystem when writing a function, but you can depend on them being defined when the functions are run.
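A sketch of depending on $OperatingSystem at run time; openExternally is a hypothetical helper, and $OperatingSystem returns "Windows", "MacOSX", or "Unix":

```mathematica
(* Open a file with the OS-appropriate shell command *)
openExternally[path_String] := Switch[$OperatingSystem,
  "Windows", Run["start \"\" \"" <> path <> "\""],
  "MacOSX",  Run["open \"" <> path <> "\""],
  "Unix",    Run["xdg-open \"" <> path <> "\""],
  _,         $Failed]
```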

You can look at the values for all your $ constants in a dataset,

Dataset@AssociationThread[symbols -> (Symbol /@ symbols)]
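Not all of these symbols have values at startup; a small variant keeps only those that do, using ToExpression's third argument to wrap ValueQ around the still-held expression:

```mathematica
(* Keep only the $-symbols that currently have a value *)
symbols = Names["System`$*"];
withValues = Select[symbols, ToExpression[#, InputForm, ValueQ] &];
Dataset @ AssociationThread[withValues -> (ToExpression /@ withValues)]
```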

Jason B.

Posted 2018-07-02T00:51:27.860

Reputation: 58 546

I’d say rather than calling them failed constants that case is more like inert symbolic constants. I can’t remember if $FrontEndSession is like that but I think it is. They’re like symbolic stand-ins for constants/concepts that aren’t nicely encapsulated in another data structure. – b3m2a1 – 2018-07-02T08:45:33.487

That is a good point. I made the post a wiki, so please add any insights you have. Also thanks to @IstvánZachar for the clarifying edit. – Jason B. – 2018-07-02T19:35:06.137


While $__ variables are not special in terms of evaluation, they are special in terms of language/package design. They alert a developer/user to the fact that $__ variables are environmental variables. Sometimes they hold values extracted from an overarching process/environment and hence are constant-like (e.g. $OperatingSystem, $FrontEnd, $Version, $InputFileName, $CloudBase, $MemoryInUse, etc.), which can be useful for enabling the relevant code paths dependent on these settings.
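For example, one might gate functionality on such environmental values; a hedged sketch, where newStyleSolve and legacySolve are hypothetical:

```mathematica
(* Choose a code path based on the running kernel version *)
solve[args___] := If[$VersionNumber >= 12.0,
  newStyleSolve[args],   (* hypothetical modern implementation *)
  legacySolve[args]      (* hypothetical fallback *)
]
```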

At other times $__ variables are designed to be user-modified, affecting the action of a suite of corresponding functions. Sometimes these settings should be changed sparingly given their role in well-designed idioms (for example, $ContextPath, $Contexts, $Packages affect and are affected by the corresponding functions Get and Needs), while at other times their modification is more routine (e.g. $PerformanceGoal, $MaxPrecision, $DisplayFunction, $IterationLimit).
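A standard idiom for the user-modifiable kind is to scope a temporary change with Block, so the session-wide value is restored afterwards (shown here with $RecursionLimit, a sibling of $IterationLimit):

```mathematica
(* Temporarily raise the recursion limit for one deep computation *)
deepFact[0] = 1;
deepFact[n_] := n * deepFact[n - 1];
Block[{$RecursionLimit = 10^4}, deepFact[5000]]
```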

In this latter case, where they are designed to be modified, setting one is similar to globally setting an option value, which can be convenient if that setting is likely to recur through a session/package. So, for example, the following pairs are equivalent:

Get[file, Path -> $Path]
Get[file]


Predict[training, PerformanceGoal -> $PerformanceGoal]
Predict[training]

So when the values of $Path and $PerformanceGoal are likely to be consistent in a given environment, they can be set once initially, and all subsequent applications of Get[file] and Predict[training] then pick up those settings without needing the explicit option.

Semantically, then, $__ variables also alert the user/developer to the fact that they are being treated as global variables, even without any formal enforcement (n.b. the default value of $Context is "Global`"). Generally it is considered bad practice to cavalierly introduce global variables, since they tend to work against modularity and independence in your code base. The $ prefix is therefore useful for flagging justified exceptions, so one takeaway might be:

Don't use global variables but if you must, try and restrict to the above scenarios and always prefix with $.
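In a package, that takeaway often looks like the following sketch (MyPackage`, $MyPackageVerbose, and process are hypothetical names):

```mathematica
BeginPackage["MyPackage`"]

$MyPackageVerbose::usage = "$MyPackageVerbose controls diagnostic output.";
process::usage = "process[x] processes x.";

Begin["`Private`"]

(* A deliberately global, user-settable flag, marked with $ *)
$MyPackageVerbose = False;

process[x_] := (
  If[TrueQ[$MyPackageVerbose], Print["processing ", x]];
  x^2
)

End[]
EndPackage[]
```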

(Speculating, but maybe this convention is a legacy of the environment variables of operating systems, like $HOME and $PATH (n.b. $HomeDirectory and $Path), where shell syntax specifies a leading $ to access a named variable's value, along with pedagogical treatments conveniently using pre-defined environment variables as first examples.)
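The analogy can be seen directly in a session (on Windows the HOME environment variable may be absent, in which case Environment returns $Failed):

```mathematica
Environment["HOME"]   (* the operating system's environment variable *)
$HomeDirectory        (* the Wolfram Language counterpart *)
```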

Ronald Monson

Posted 2018-07-02T00:51:27.860

Reputation: 5 681

I have a long-running habit of appending $ to ad-hoc variables, that is, variables that I reuse without ClearAll. When developing iteratively in a notebook, it's important to clear variables before defining rules for them lest stale definitions from elsewhere in the Notebook leak in and fool you. But certain variables are just there for ad-hoc tests and the risk of stale definitions is negligible. Example: f$ = m * a[q,t] for a Newtonian force, where a is a carefully defined function with a ClearAll before its definition. – Reb.Cabin – 2018-07-04T04:20:22.153
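Reb.Cabin's habit might look like this in a notebook, where m, q, t remain symbolic and the trailing $ marks the throwaway variable:

```mathematica
ClearAll[a];
a[q_, t_] := q * Sin[t];   (* carefully defined, cleared before definition *)

f$ = m * a[q, t]           (* ad-hoc test value; no ClearAll needed *)
```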