nikos101
Inspiring
January 16, 2010
Answered

Convert the money datatype to a 2 decimal point format

  • January 16, 2010
  • 2 replies
  • 8829 views

What's the best way to convert the money datatype to a two-decimal-place format in MS SQL 2005 for use in my applications?

Something like this?

CAST(tr.depositReceivedAmount AS decimal(10, 2))

This topic has been closed for replies.
Correct answer: TLC-IT (see reply below)


2 replies

TLC-IT (Correct answer)
Inspiring
January 18, 2010

I respectfully disagree with the notion that you should change the SQL column from a 'money' data-type to something else.

In most database servers, 'money' is a data type designed to give very consistent, exact behavior in arithmetic.  In Microsoft Access the representation is a scaled integer; in MS SQL Server, money is likewise a fixed-point type stored as a scaled integer with four decimal places, and most other engines offer a similar exact type.
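
As a quick illustration of that exactness, here is a minimal T-SQL sketch (the values are arbitrary, and SET is used so it runs on SQL 2005):

DECLARE @a money, @b money;
SET @a = 0.10;
SET @b = 0.20;
SELECT @a + @b AS exact_sum;                                  -- exactly 0.30, because money is stored as a scaled integer
SELECT CAST(0.1 AS float) + CAST(0.2 AS float) AS float_sum;  -- a binary float cannot store 0.1 exactly, so the sum can drift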

You want the money data-type in the database to have this accuracy, because "hell hath no fury like an accountant in search of one lousy penny."   The database column storage-formats are designed to satisfy accountants, and that is a Good Thing.

Meanwhile, you also want to take care as to exactly how you deal with the values.  There are several points where rounding could take place.  ColdFusion does not give you the strongest possible handling of floating-point data types, and you are also somewhat at the mercy of whatever driver or interface software sits between you and the SQL server.  It is okay to round a value once, but not multiple times.
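
Here is a small sketch of why repeated rounding is risky; the values are arbitrary, but rounding in two steps can land on a different answer than rounding once:

SELECT ROUND(2.2451, 1);               -- 2.2000: rounded once, straight to one decimal place
SELECT ROUND(ROUND(2.2451, 2), 1);     -- 2.3000: the intermediate 2.2500 rounds up a second time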

I suggest rounding the value right before display, and stipulating that the user's input must have at most two decimal places.
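
In practice, that can be as simple as doing the conversion in the query that feeds the page, along the lines of the cast in the original question (the table name below is hypothetical; the column and alias come from the question):

SELECT CAST(tr.depositReceivedAmount AS decimal(10, 2)) AS depositForDisplay
FROM dbo.transactionRecord AS tr;      -- hypothetical table name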

Then, you might have to do some things at the SQL server's end.  For instance, when you update a value in the table, you may need to use server-side logic to explicitly round (or truncate) the value to two decimal places, so that an update of "$34.56" stores exactly "$34.5600" in the column.  (This sort of thing has to happen within the SQL server context.)  You know that the user's input has exactly two decimal places, but maybe (maybe not...!) the SQL server might not know this.  You want to ensure that the server's internally-stored value represents exactly two decimal places when the value originates from a user input.
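
A sketch of that server-side step, assuming the update goes through a stored procedure (the procedure, table, and key-column names here are hypothetical; only depositReceivedAmount comes from the question):

CREATE PROCEDURE dbo.updateDepositReceived                        -- hypothetical procedure name
    @transactionId int,
    @depositReceivedAmount money
AS
BEGIN
    UPDATE dbo.transactionRecord                                  -- hypothetical table name
    SET depositReceivedAmount = ROUND(@depositReceivedAmount, 2)  -- force exactly two decimal places on the server
    WHERE transactionId = @transactionId;                         -- hypothetical key column
END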

Don't err on the side of "your convenience" or "what looks good on-screen."  (If you do, get ready to get phone-calls from the accountants, always at inopportune hours of the night.)

nikos101 (Author)
Inspiring
January 18, 2010

Many thanks for that. The main reason was that I heard money was being deprecated.

The only trouble I could see with numeric(18, 2) is that some currencies have more than 100 minor units to the major unit, i.e. more than two decimal places.
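
For what it's worth, here is a quick sketch of what a two-decimal scale drops for such a currency (the values are arbitrary):

DECLARE @threeDecimals numeric(18, 3);
SET @threeDecimals = 1.234;                      -- e.g. a Bahraini dinar amount, which uses three minor-unit digits
SELECT CAST(@threeDecimals AS numeric(18, 2));   -- 1.23: the third decimal place is rounded away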

Inspiring
January 16, 2010

What happened when you tried it?


nikos101 (Author)
Inspiring
January 16, 2010

It works, but I wasn't sure if that was the best way to do it, given space/speed considerations.

best wishes

Nikos

Inspiring
January 16, 2010

The best way to do it is to change the datatype of the column, if you are able to.
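
If you are able to change it, that is a one-line schema change; here is a sketch with a hypothetical table name (the column name echoes the question):

-- Any views, procedures, or constraints that reference the column should be checked afterwards.
ALTER TABLE dbo.transactionRecord                -- hypothetical table name
    ALTER COLUMN depositReceivedAmount decimal(10, 2);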