Saturday, May 27, 2006

New Blog

I really don't like Blogger, so I have decided to set up a blog on my own server (what else are web servers good for?). The new URL is:

http://blogs.solidhouse.com/david.woods/

As such, all new posts will be found at the URL above.

Friday, May 19, 2006

Vista takes 15GB?

I just ran the Vista Upgrade Advisor (it tells you if you have the requirements to run Vista) and was shocked to get this error:

We're sorry, but your PC cannot currently install and run the core experiences of Windows Vista.

However, you may be able to prepare your computer for Windows Vista by upgrading your PC hardware.

You will need to take the following actions to run Windows Vista.

Additional hard drive storage
15GB free space required (Your computer currently has 3.53 GB)
You will either need to:
a) upgrade your hard drive to increase its capacity, or
b) create additional free space on your existing drive by removing unwanted files.
If you decide to upgrade your hard drive, we recommend 40GB capacity at minimum for premium editions of Windows Vista. Contact your PC retailer to see if an upgrade is available.


Why in the world is an OS 15GB? How many DVDs is it coming on? Or better yet, how many CDs for the non-DVD people?

Wednesday, May 17, 2006

Backups are important

I think backups are usually the most poorly done thing in any organization. Even if you have a backup solution, it is very rarely checked to ensure that it is working.

1. How often are backups verified / tested? If possible, I recommend taking a system, restoring your backup to it, and seeing if everyone can still work (see the sketch after this list for a quick way to spot-check SQL Server backups).

2. How old are your tapes? They don't last forever, you know. I don't know the average life, but I am guessing 1-2 years depending on rotation.

3. Do you have a rotation (i.e. a tape for every day of the week)? If yesterday's tape is corrupt, then going back two days is not terrible.

4. Are your backups stored offsite? If not, set your building on fire as an exercise and see how well you can recover.

5. Do you have a backup server? Replacement hardware can take time to install and configure. I recommend having a spare server on site that you can restore to (or better yet clone your data to it periodically and have a spare with all the data).
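
For the database side of point 1, here is a minimal sketch of how you might spot-check a SQL Server backup; the file, database, and logical file names below are just placeholders for your own:

-- Quick sanity check: confirms the backup file is complete and readable
-- (it does NOT prove the data inside is usable)
RESTORE VERIFYONLY FROM DISK = 'D:\Backups\Accounting_Full.bak'

-- The real test: restore to a throwaway database on a test box and see if people can actually work against it
RESTORE DATABASE Accounting_Test
FROM DISK = 'D:\Backups\Accounting_Full.bak'
WITH MOVE 'Accounting_Data' TO 'D:\TestRestore\Accounting_Test.mdf',
     MOVE 'Accounting_Log' TO 'D:\TestRestore\Accounting_Test.ldf',
     REPLACE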

Backups are not just about computers.

Do you have a backup internet connection?

Do you have a backup air conditioner? (one of my clients had all their servers go down last night as the air conditioner crapped out).

Do you have backup power? Even if it is only 15 minutes, that could be all the time you need to save your work.

Do you have a backup backup operator? If only one person knows how to back up / restore, then just hope they don't get sick.

Do you have your servers off the floor? Floods and computers don't mix. I had a client whose air conditioner's drain pipe (which carries away the condensation) broke and flooded the server room. Thankfully the servers and power connections were off the floor.

If you are not confident that you could wipe all the computers in your organization and be back at work 100% in a quick period of time (1-2 days depending on the size of the organization), then it's time to re-examine your backup solution(s).

Tuesday, May 16, 2006

Starting an application as another user

Monday, May 15, 2006

My Toolbox

-FxCop. Great tool for keeping code consistent
-NUnit. Unit testing tool
-CopyAsHTML (great for blogging / documentation)
-DataSet Quick Watch (http://www.codeproject.com/csharp/DSWatch/DSWatchSetup.zip)
Great tool for seeing a DataSet in a DataGrid while debugging
-Firefox
Firefox plugins:
-IE Tab (opens a page that is only IE compatible within Firefox)
-Sage (nice little newsreader)
-Web Developer (if you do work with the web, get this... period)
-Adblocker

What we are doing wrong

I recently had a rant about us as developers not having a clue what we are doing. I want to try and narrow this down to a few areas that I think need improvement.

1. Testing. We don't take a step back and look at different scenarios. Whenever I develop something, it is to do a task, and that is what I test. I never test a method to see how it reacts when data outside that scenario is passed in.

2. UI. I never make an application usable. It's not because I hate the user, but because my emphasis is on functionality. We almost need to treat the interface as the piece that deserves the most attention and design a good, clean interface that is user friendly. I need to start designing programs thinking that the end user is going to be my grandma or someone like that.

3. Data. My applications are filled with garbage data. I need to find a better way to put realistic data into my applications (one low-tech option is sketched after this list). One thing to be careful of is copying production data into development, as this may violate privacy laws depending on where you live (seriously).

4. Load & Concurrency. I hardly ever test my applications under high load, nor do I test what would happen if two methods got run at the same time. I ran some tests on one application and was surprised at the number of deadlocks I got in the database (then again, it might have been my testing tool).

5. Design. I really think most of us don't know how to design an application. We all have different views and opinions (or are lacking in this department). I think we need to start looking at all the different ways of doing things and start to reach a consensus.

6. Client Interaction. I find that the client is not involved enough and is then upset when they don't get what they asked for. This is one reason I really like agile: it keeps the client heavily involved in the development project.

7. Unit Testing. I have found this to be such a useful tool / practice. Since I started using it, the quality of my releases has been higher: there are few if any regression bugs, and I feel more confident making changes to existing code. If you have not tried having unit tests of some kind in an application, I really think you should.
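
For point 3, here is a minimal sketch of a low-tech data generation script, assuming a hypothetical Customers table with FirstName, LastName, City, and CreatedOn columns:

SET NOCOUNT ON

DECLARE @i int
SET @i = 1

WHILE @i <= 5000
BEGIN
    INSERT INTO Customers (FirstName, LastName, City, CreatedOn)
    VALUES (
        -- cycle through a handful of realistic first names
        CASE @i % 5 WHEN 0 THEN 'Anne' WHEN 1 THEN 'Bob' WHEN 2 THEN 'Carlos' WHEN 3 THEN 'Dana' ELSE 'Erin' END,
        'Tester' + CAST(@i as varchar(10)),
        CASE @i % 3 WHEN 0 THEN 'Winnipeg' WHEN 1 THEN 'Toronto' ELSE 'Halifax' END,
        -- spread dates over the last year so date-range queries look realistic
        DATEADD(day, -(@i % 365), GETDATE())
    )
    SET @i = @i + 1
END

SET NOCOUNT OFF

It is nowhere near as good as real production data, but it keeps names, cities, and dates varied enough that searching, sorting, and paging behave more like they will in production.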

Friday, May 12, 2006

Why .NET applications use so much memory

You may notice that even a small application seems to consume a lot of memory on your system. I am starting to think that this is because the runtime reserves two 16MB segments for the small and large object heaps. I assume you don't often see the full 32MB in Task Manager because a lot of it is unused and gets paged to disk. It's just a theory though :)

Wednesday, May 10, 2006

Permission generation script

Here is a handy security script that will generate the grant execute / select / delete permissions on all objects in your database, if you have the need to do so:



Declare @RevokeSQL varchar(1000)
Declare @GrantSQL varchar(1000)
declare @EveryoneRoleName varchar(30)

set @EveryoneRoleName='Public'

set nocount on

select P.ID, U.Name as UserName, o.name as ObjectName,
case P.ProtectType
when 204 then 'GRANT_W_GRANT'
when 205 then 'GRANT'
when 206 then 'REVOKE'
end as ProtectType,
case p.action
when 26 then 'REFERENCES'
when 178 then 'CREATE FUNCTION'
when 193 then 'SELECT'
when 195 then 'INSERT'
when 196 then 'DELETE'
when 197 then 'UPDATE'
when 198 then 'CREATE TABLE'
when 203 then 'CREATE DATABASE'
when 207 then 'CREATE VIEW'
when 222 then 'CREATE PROCEDURE'
when 224 then 'EXECUTE'
when 228 then 'BACKUP DATABASE'
when 233 then 'CREATE DEFAULT'
when 235 then 'BACKUP LOG'
when 236 then 'CREATE RULE'
end as PermissionGranted
into #Temp
from sysprotects P
inner join sysusers U on P.UID = U.UID
inner join sysobjects O on P.ID=O.ID where
P.uid=0 and o.Type<>'S' and
(O.Name not like 'sync%'
and O.Name not like 'sys%'
and O.Name not like 'dt_%')
-- just added the funky syntax for o.name filters
order by UserName, ObjectName

DECLARE cur CURSOR
READ_ONLY
FOR Select UserName, PermissionGranted, ObjectName from #Temp

DECLARE @name varchar(40)
DECLARE @ProtectType varchar(100)
DECLARE @ObjectName varchar(100)
OPEN cur

FETCH NEXT FROM cur INTO @name, @ProtectType, @ObjectName
WHILE (@@fetch_status <> -1)
BEGIN
    IF (@@fetch_status <> -2)
    BEGIN
        -- create has a slightly different syntax, so we have to branch here
        if @ProtectType like 'Create%'
        begin
            set @GrantSQL = 'Grant ' + @ProtectType + ' to ' + @EveryoneRoleName
            print @GrantSQL
            -- exec(@GrantSQL)
            -- set @RevokeSQL = 'Revoke ' + @ProtectType + ' on [' + @ObjectName + '] from ' + @Name
            -- print @RevokeSQL
            -- exec(@RevokeSQL)
        end
        else
        begin
            set @GrantSQL = 'Grant ' + @ProtectType + ' on [' + @ObjectName + '] to ' + @EveryoneRoleName
            print @GrantSQL
            -- exec(@GrantSQL)
        end
    END

    FETCH NEXT FROM cur INTO @name, @ProtectType, @ObjectName
END

CLOSE cur
DEALLOCATE cur

--clean up the working table
drop table #Temp

set nocount off
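
Note that as written the script only prints the GRANT statements. To actually apply them, either copy the printed output into a new query window and run it, or uncomment the exec(@GrantSQL) lines. Also, if you are on SQL Server 2005, a schema-level grant may get you most of the way there without generating per-object statements at all; a minimal sketch, assuming everything lives in the dbo schema:

-- SQL Server 2005 and up: one grant per permission covers existing and future objects in the schema
GRANT EXECUTE ON SCHEMA::dbo TO Public
GRANT SELECT ON SCHEMA::dbo TO Public
GRANT DELETE ON SCHEMA::dbo TO Public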

SQL: Dynamic SQL Avoidance

I have several stored procs that take a flag telling them whether or not to show expired records, and they usually look something like this:


DECLARE @ShowExpired tinyint
SET @ShowExpired = 1

declare @sql varchar(1000)
set @sql = 'SELECT * FROM [Table]'
IF @ShowExpired = 0
    set @sql = @sql + ' WHERE eff_end_dt is NULL'
execute (@sql)



While this works, I lose the advantages of stored procs, and if I rename a table and recreate the procedure (I periodically recreate all procedures to test this) I will not see any error, since the table name is buried inside a string.

Instead I came up with this:


DECLARE @ShowExpired tinyint
SET @ShowExpired = 1

select * from [Table]
where ((@ShowExpired=1) or (@ShowExpired=0 and eff_end_dt is null))

Basically, all that is happening is that if @ShowExpired=1 then everything gets returned, but if @ShowExpired=0 then only records that have a null end date (i.e. not yet expired) are returned.
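
Here is a minimal sketch of the same pattern wrapped up as a stored procedure so you can see how it gets called (GetRecords and [Table] are just placeholder names):

CREATE PROCEDURE GetRecords
    @ShowExpired tinyint = 0
AS
BEGIN
    SET NOCOUNT ON

    -- one query handles both cases: no filter when @ShowExpired = 1, active-only when @ShowExpired = 0
    SELECT *
    FROM [Table]
    WHERE (@ShowExpired = 1)
       OR (@ShowExpired = 0 AND eff_end_dt IS NULL)
END
GO

-- usage
EXEC GetRecords @ShowExpired = 1   -- everything, including expired records
EXEC GetRecords @ShowExpired = 0   -- only records with no end date

One thing to keep in mind: both cases share a single cached plan, so on a large table it is worth checking the execution plan (or creating the procedure WITH RECOMPILE) if performance matters.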

Tuesday, May 09, 2006

SQL Coding standards