N00b wanna learn more about database programming

Discussion in 'Programming & Software Development' started by xhanatos, Jul 9, 2007.

  1. Luke212

    Luke212 Member

    Joined:
    Feb 26, 2003
    Messages:
    10,041
    Location:
    Sydney
    generics are good because they allow more maintainable code: type-safe, compile-time checking. they also do not require a cast, so they are faster.

    edit: CollectionBase is good, but it is not type-safe, and errors can be harder to avoid or find.
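    A quick sketch of the difference (in Java, since it has the same raw-vs-generic collection split; the class and method names are made up for illustration — note that Java, unlike .NET, erases generics to casts at runtime, so the "no cast, so faster" point is .NET-specific):

```java
import java.util.ArrayList;
import java.util.List;

public class GenericsDemo {
    // Untyped collection, analogous to .NET's CollectionBase:
    // callers must cast on the way out, and a wrong element type
    // only fails here, at runtime, as a ClassCastException.
    @SuppressWarnings({"rawtypes", "unchecked"})
    static String firstRaw(List raw) {
        return (String) raw.get(0);
    }

    // Generic collection: the element type is checked when the code is compiled.
    static String firstTyped(List<String> typed) {
        return typed.get(0); // no cast needed
    }

    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        names.add("xhanatos");
        // names.add(42);  // would not compile: caught at compile time
        System.out.println(firstTyped(names));

        List raw = new ArrayList();
        raw.add(42); // nothing stops the wrong type going in
        try {
            firstRaw(raw); // the error only surfaces here
        } catch (ClassCastException e) {
            System.out.println("runtime type error");
        }
    }
}
```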
     
    Last edited: Jul 18, 2007
  2. Ze.

    Ze. Member

    Joined:
    Sep 13, 2003
    Messages:
    7,871
    Location:
    Newcastle, NSW
    I agree with you about generics in regard to compile-time checking, but whether they use casting or not depends on the implementation.
     
  3. Luke212

    Luke212 Member

    Joined:
    Feb 26, 2003
    Messages:
    10,041
    Location:
    Sydney
    ok, all that is fair enough. we will end up arguing definitions. after all, there is no such thing as perfectly optimised code; optimisation is a sliding scale with no end point.

    So to summarise my point,
    there is no end to a large software project. Someone has to maintain it. Requirements always change, and this makes less optimised code more agile and flexible.

    Optimisation (the process of) => Specificity => Rigidity
    Optimisation (the process of) => Lower Level Code => Less Safe & Less Readable

    horses for courses? ;)
     
  4. QuakeDude

    QuakeDude ooooh weeee ooooh

    Joined:
    Aug 4, 2004
    Messages:
    8,538
    Location:
    Melbourne
    There you go xhanatos, you want to learn what not to do, check out the last few pages.. :lol:
     
  5. bugayev

    bugayev Whammy!

    Joined:
    May 15, 2003
    Messages:
    4,093
    Location:
    Melbourne
    Optimisation (the process of) => Review => Improved quality
    Optimisation (the process of) => Better Code => Safer => More Readable => More Logical => More Agile

    You seem to misunderstand what optimisation is really about. It's not re-writing the code to make it n% more hardcore; it is about reviewing the code, highlighting areas which need re-work because they are less efficient and affect performance, and making those changes. If the code is performing well, it does not need re-factoring.

    Going in and making piecemeal changes under the name of optimisation is a total waste of time.

    Requirements change, maintenance is always required. If you have a set of documented standards which lay out the way certain things should be done, then maintaining code becomes much easier.

    Just because something takes less time to edit because it's badly written does not make it "more agile".

    As for a definition of optimised code:

    optimization is the process of modifying a system to make some aspect of it work more efficiently or use fewer resources

    seems pretty good to me :)
     
  6. Ze.

    Ze. Member

    Joined:
    Sep 13, 2003
    Messages:
    7,871
    Location:
    Newcastle, NSW
    That definition seems pretty good to me :)

    heck, optimisation is often a trade-off where you trade one performance characteristic for another (such as space for time, or search speed for insertion speed).
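    A minimal sketch of the space-for-time trade-off (Java; the Fibonacci example is hypothetical): memoising a recursive function spends memory on a cache to avoid recomputation:

```java
import java.util.HashMap;
import java.util.Map;

public class Memo {
    private static final Map<Integer, Long> cache = new HashMap<>();

    // Plain recursion: minimal memory, exponential time.
    static long fibSlow(int n) {
        return n < 2 ? n : fibSlow(n - 1) + fibSlow(n - 2);
    }

    // Memoised recursion: the cache spends memory to buy linear time.
    static long fibFast(int n) {
        if (n < 2) return n;
        Long hit = cache.get(n);
        if (hit != null) return hit;
        long value = fibFast(n - 1) + fibFast(n - 2);
        cache.put(n, value);
        return value;
    }

    public static void main(String[] args) {
        // fibSlow(40) would take roughly a billion calls; fibFast(40) takes ~40.
        System.out.println(fibFast(40)); // 102334155
    }
}
```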
     
  7. Luke212

    Luke212 Member

    Joined:
    Feb 26, 2003
    Messages:
    10,041
    Location:
    Sydney
    I'm not misunderstanding, and I haven't said anything contrary to the above. I haven't once made a claim to authority (ok, once), and you're just pissing me off by talking down to me like that.
     
  8. bugayev

    bugayev Whammy!

    Joined:
    May 15, 2003
    Messages:
    4,093
    Location:
    Melbourne
    I apologize if you feel I'm talking down to you. I am presenting my opinion. Something I feel very strongly about, and a large part of my current job, is identifying areas of code which are crying out for optimization!

    :)
     
  9. GumbyNoTalent

    GumbyNoTalent Member

    Joined:
    Jan 8, 2003
    Messages:
    9,743
    Location:
    Briz Vegas
    Optimization doesn't necessarily mean using less, performance optimization can actually use more resources to achieve quicker response times.

    EDIT :: While we're on the topic, perhaps one of the database gurus can answer an indexing question for me.

    On a hashed table, do queries work faster if the key is numeric, alpha, or alphanumeric, or does it make no difference?

    And the same question for indexes on keys?

    In particular, MySQL MyISAM.
     
    Last edited: Jul 18, 2007
  10. xsive

    xsive Member

    Joined:
    Jun 29, 2001
    Messages:
    4,343
    Agree totally with this, however I can't help but feel you're arguing two different things here: conceptual optimisation and code optimisation.

    You don't optimise your code for speed from the beginning; you try and solve the problem in the most reasonable way that meets some set of requirements. There will be lots of opportunities to design the most appropriate architecture/component/class and a good developer should always consider these things up front. What results is conceptually optimised code that might well be fast but that was probably written the way it was because it was a good fit to the nature of the problem and because it follows some best practice; not because it's striding through memory in the most efficient way.

    The process of going in and changing code with the sole purpose of making it execute more efficiently comes much later -- usually during integration testing when you try to measure how well your system is shaping up to meet requirements. Incidentally, those code optimisations do result in more obfuscated source (for example, loop unrolling is messy) but these changes should be the last resort in the quest for more performance.
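    For illustration, here is what manual loop unrolling looks like (a made-up summation example in Java): the unrolled version computes the same result but is clearly harder to read, which is why such changes should be a last resort:

```java
public class Unroll {
    // Straightforward sum: easy to read and verify.
    static long sum(int[] a) {
        long total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i];
        }
        return total;
    }

    // Manually unrolled by 4: fewer loop-condition checks per element,
    // but messier — a trailing loop is needed for the leftover elements.
    static long sumUnrolled(int[] a) {
        long total = 0;
        int i = 0;
        int limit = a.length - (a.length % 4);
        for (; i < limit; i += 4) {
            total += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
        }
        for (; i < a.length; i++) {
            total += a[i]; // leftovers when length is not a multiple of 4
        }
        return total;
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3, 4, 5, 6, 7};
        System.out.println(sum(data) == sumUnrolled(data)); // same answer either way
    }
}
```

    Whether the unrolled version is actually faster depends on the JIT and hardware — modern compilers often unroll automatically, which is another reason to measure before hand-optimising.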

    Solve the problem in some conceptually optimal way and there's far less chance you'll screw things up trying to make it execute quickly.
     
  11. bugayev

    bugayev Whammy!

    Joined:
    May 15, 2003
    Messages:
    4,093
    Location:
    Melbourne
    I recall an article on MySQL from some time ago indicating that a numeric lookup/comparison will be faster than anything that's a string. That may be a complete load of crap these days, but...
     
  12. hyperstyle

    hyperstyle Member

    Joined:
    May 24, 2003
    Messages:
    1,731
    Location:
    Brisbane
    It's true, in the case of MySQL anyway, but usually I don't think it makes fuck-all difference. There's only been one occasion where I've gained a performance increase worth worrying about (usually it's nothing to .001 seconds). That one occasion was strange, and I think it may have had to do with memory/cache limits as well. The table had about a million rows joining with a few other tables, but it was only slightly larger than tables that showed no increase. I learnt my lesson anyway, and from now on I use numeric primary/foreign keys where possible, especially on large tables.
     
    Last edited: Jul 18, 2007
  13. Ze.

    Ze. Member

    Joined:
    Sep 13, 2003
    Messages:
    7,871
    Location:
    Newcastle, NSW
    I view it as a holistic approach to development, keeping in mind what is needed from the outset. Hence why I dislike the rule of "implement first and optimise later": it means we aren't looking at the whole problem, but only at a subset of the constraints.

    One keeps in mind what is needed and allows room for optimisations to be made where necessary. If they are kept in mind as possibilities from the outset, then we have far less of the "hack the code in here" syndrome.
    Of course, ideally we want to save ourselves from having to do some optimisations and from getting right down into the nitty-gritty unless necessary; if we choose the right architecture and design, going to drastic levels won't be as necessary.
     
  14. Luke212

    Luke212 Member

    Joined:
    Feb 26, 2003
    Messages:
    10,041
    Location:
    Sydney
    this is all airy-fairy design stuff and has nothing to do with actual coding :p

    once you've finished your planning, how about you write some code, and then we can think about going back and tuning it.
     
  15. xsive

    xsive Member

    Joined:
    Jun 29, 2001
    Messages:
    4,343
    Holistic, certainly, provided your efforts are limited to conceptual optimisation. It's utterly pointless focusing on improving execution efficiency before you've really solved the problem. The biggest gains come from an implementation that exploits opportunities afforded by the theoretical nature of the problem, not from improving the efficiency of execution.

    Right. Except that spending time crafting a system with perfect memory use (for example) will be time wasted if you've already met your performance requirements in other ways (like through a well thought out higher-level design). Quality is about fitness for purpose; avoid premature low-level optimisation until a specific need arises, and then focus your efforts on the bits that will provide the most benefit.

    Far better to rewrite small methods to make them go fast than to try and build some needlessly optimised (and needlessly obfuscated because we've done such a great job) super solution that wasn't asked for anyway.

    <edit> Incidentally, no one is suggesting that the first-cut method implementations are hacks. You still write quality code no matter what; all I'm advocating is that this quality is not always measured using a performance-based metric. </edit>

    Exactly. In the wise words of Donald Knuth, "Premature optimisation is the root of all evil (or at least most of it) in programming".
     
    Last edited: Jul 18, 2007
  16. cletus

    cletus Member

    Joined:
    May 25, 2005
    Messages:
    672
    Personally I think a better definition of optimisation is:

    The process of making a tradeoff of a less desirable quality to a more desirable one.

    Go back 10+ years and you'll find people would sometimes optimise programs for size, not speed ("optimisation" is synonymous with speed these days), particularly when you were programming for the tiny/small/medium memory models (who here is old enough to know what I mean by that?). Size is still an issue with embedded systems, though, I guess.

    Likewise, a program optimised for speed will typically use more resources, not less, e.g. implementing an over-eager in-memory caching system to speed up database accesses.
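    A hedged sketch of that trade-off in Java: a tiny bounded LRU cache that spends memory to avoid repeating an expensive lookup. Here `slowLookup` is a made-up stand-in for a database round trip; all names are for illustration only:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class QueryCache {
    // Access-ordered LinkedHashMap evicting the least recently used
    // entry once the cache exceeds its capacity.
    private final Map<String, String> cache;
    int misses = 0; // counts how often we had to hit the "database"

    QueryCache(int capacity) {
        this.cache = new LinkedHashMap<String, String>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > capacity;
            }
        };
    }

    // Stand-in for an expensive database round trip (hypothetical).
    private String slowLookup(String key) {
        misses++;
        return "value-for-" + key;
    }

    // Serve from memory when possible; fall back to the slow path on a miss.
    String get(String key) {
        return cache.computeIfAbsent(key, this::slowLookup);
    }

    public static void main(String[] args) {
        QueryCache qc = new QueryCache(2);
        qc.get("user:1"); // miss: hits the "database"
        qc.get("user:1"); // hit: served from memory
        qc.get("user:2"); // miss
        qc.get("user:3"); // miss: evicts the least recently used entry
        System.out.println(qc.misses); // 3
    }
}
```

    The cache is "over-eager" in exactly cletus's sense: every byte it holds is memory spent purely to shave time off repeat lookups, and sizing it badly just trades one problem for another.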

    Size vs speed is a traditional trade-off, but you can add cost/complexity/risk to that too (basically the same thing). An optimised program is typically more complicated (i.e. more costly/risky) than a naive implementation.
     
  17. Elyzion

    Elyzion Member

    Joined:
    Oct 27, 2004
    Messages:
    7,449
    Location:
    Singapore
    Was looking for some stuff on Case vs If Statement in SQL Server and found this:

    http://www.sql-server-performance.com/stored_procedures.asp

     
    Last edited: Jul 19, 2007
  18. GumbyNoTalent

    GumbyNoTalent Member

    Joined:
    Jan 8, 2003
    Messages:
    9,743
    Location:
    Briz Vegas
    Ok, here is the lowdown: MyISAM doesn't use hashed indexes (1 for me, not my colleague), so numeric indexes are quicker. :)

    Interestingly, InnoDB looks like the king for webapps which don't delete rows (another one of my rules :)) but update a flag instead.
     
  19. hyperstyle

    hyperstyle Member

    Joined:
    May 24, 2003
    Messages:
    1,731
    Location:
    Brisbane
    That's a bit of a sharp rule. Just updating a flag rather than deleting a row totally depends on the situation. It's nice for history and will keep your MyISAM tables tidy, but only if the table does not do a lot of inserts and 'deletes'; too much wasted data will slow your table down. If history isn't important and I don't want to give the application's database user delete privileges (which is a good idea), I run a routine clean-up script as appropriate.
     
  20. GumbyNoTalent

    GumbyNoTalent Member

    Joined:
    Jan 8, 2003
    Messages:
    9,743
    Location:
    Briz Vegas
    Deleting from a highly active table causes grief regardless of the DB in question, and hot/live deletion is just plain nuts in my field, banking.
     
