I have a DataTable with a column containing decimal data. Immediately before saving the table into a Money column of a SqlCe database, the VS debugger shows that the table contains the full precision of my data (e.g. a value of .05421). After saving the table to the database with DataAdapter.Update and then reading the data back into a table with DataAdapter.Fill, the VS debugger shows that some precision/scale has been lost (e.g. the value becomes .0542). (Other values, e.g. .00537 and .02148, are truncated the same way, so the issue is not simply that the trailing 1 in .05421 rounds away to 0.)
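For context, here is a minimal sketch of the round trip I'm doing (table and column names simplified; the real schema is larger, and the connection string/file path will differ):

```csharp
using System;
using System.Data;
using System.Data.SqlServerCe;

class Repro
{
    static void Main()
    {
        // Assumes an existing .sdf file with a table Rates(Id int, Amount money)
        using (var conn = new SqlCeConnection("Data Source=test.sdf"))
        {
            conn.Open();

            var adapter = new SqlCeDataAdapter("SELECT Id, Amount FROM Rates", conn);
            var builder = new SqlCeCommandBuilder(adapter); // generates UPDATE/INSERT commands

            var table = new DataTable();
            adapter.Fill(table);

            // In the debugger the DataTable holds the full value here, e.g. 0.05421m
            table.Rows[0]["Amount"] = 0.05421m;
            adapter.Update(table);

            // Re-read: the value comes back with only four decimal places, e.g. 0.0542
            table.Clear();
            adapter.Fill(table);
            Console.WriteLine(table.Rows[0]["Amount"]);
        }
    }
}
```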
In reading about decimal precision and scale, I've seen suggestions that the default for decimal in C# is precision 19 and scale 4. That scale matches what I'm actually seeing, coincidentally or not.
What do I need to do to preserve the precision/scale of my decimal/money data? More generally, can I explicitly set the scale of my decimal values, and if so, how?
Steve