lecture17-july22 - Assignment 3 Programs due on Sunday...

Assignment 3: programs due on Sunday.

Topics for today: algorithm efficiency (2.8)
- Performance analysis using Big O
- By increasing orders of magnitude
- Done at design time

Questions to ask about a program:
- Does my program do what I want it to do? Does it work correctly according to the requirements given?
- Does the documentation describe how to use it and how it works?
- Is the code of good style?
- Is the program well designed?
  - Good modularity
  - Useful encapsulation
  - The right abstractions
- Is the program well tested?
- How efficient is my algorithm/program?

In examining algorithm efficiency we must understand the idea of complexity, of which there are two kinds:
- Space complexity
- Time complexity

Time complexity notations:
- T(n) – the time that a program (or method) takes as a function of the number of data items (n) that it must input/manipulate; exact measurement may be difficult
- f(n) – any function of n, generally a simpler version of T(n) with only the most dominant term included
- O(f(n)) – an order-of-magnitude approximation of T(n), indicating that T(n) grows no faster than f(n)

Space utilization – the amount of memory required:
- When memory was expensive, we focused on making programs as space efficient as possible and developed schemes to make RAM appear larger than it really was (virtual memory and memory paging schemes)
- Space complexity is still important in embedded computing (handheld computer-based equipment such as cell phones and palm devices), where a small footprint is required and weight, size, space, and power are constrained
- It is still a big issue in databases and data warehouses
- Usually approximated in bytes (kilo, mega, giga, tera, peta, ...), or in the approximate number of elements to be stored, plus the size of the metadata for DBMSs

Further questions:
- Is the algorithm computable?
- Is the algorithm/program "fast enough" for my needs?
- How much longer will the algorithm take if I increase the amount of data it must process?
- Given a set of algorithms that accomplish the same thing, which is the right one to choose?
- How do I estimate this at design time?
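The notations above can be made concrete with a small sketch. This is a minimal illustration (the class and method names are my own, not from the lecture): the nested loops below perform roughly T(n) = n² + n + 1 basic steps for an n-element array, so the dominant term gives f(n) = n² and the method is O(n²).

```java
// Illustrative sketch: identifying the dominant term of a simple method.
public class BigODemo {
    // Sums a[i] * a[j] over every pair (i, j).
    // Outer loop: n iterations; inner loop: n iterations each,
    // so the work is dominated by the n^2 inner-body executions -> O(n^2).
    static long sumOfProducts(int[] a) {
        long sum = 0;
        for (int i = 0; i < a.length; i++) {
            for (int j = 0; j < a.length; j++) {
                sum += (long) a[i] * a[j];
            }
        }
        return sum;
    }

    public static void main(String[] args) {
        // For {1, 2, 3} the sum of all pairwise products is (1+2+3)^2 = 36.
        System.out.println(sumOfProducts(new int[] {1, 2, 3}));
    }
}
```

Note that the lower-order terms (the n loop-counter updates, the one-time initialization) are exactly what f(n) drops when we keep only the dominant term.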
And how do I measure it at run time?

How do we measure time efficiency?
- Time efficiency is the time required to accomplish the task
- Time efficiency depends on:
  - the size of the input data set
  - the quality of the source code
  - the speed of the machine (e.g., the JVM)
  - the quality of the compiler
- These vary from one platform to another
- In both cases:
  - a priori estimates can be made during design
  - a posteriori measurements can be taken when the code exists
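An a posteriori measurement can be as simple as timing a method at run time. A minimal sketch (the method being timed is a stand-in, not from the lecture):

```java
// Illustrative a posteriori measurement using System.nanoTime().
public class TimingDemo {
    // Stand-in workload: sums the integers 1..n.
    static long sumTo(long n) {
        long total = 0;
        for (long i = 1; i <= n; i++) {
            total += i;
        }
        return total;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();          // timestamp before the work
        long result = sumTo(10_000_000L);
        long elapsed = System.nanoTime() - start; // elapsed wall-clock nanoseconds
        System.out.println("result = " + result);
        System.out.println("elapsed ms = " + elapsed / 1_000_000);
    }
}
```

Because the measured time depends on the machine, JVM, and compiler (the platform factors listed above), such numbers are only comparable when taken on the same platform under the same conditions.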
Complexity: a measure of the amount of resources consumed in solving a problem of size n
- time
- space

Benchmarking: implement the algorithm, run it with some specific input, and measure the time taken
- better for comparing the performance of processors than for comparing the performance of algorithms

Big O (asymptotic execution time): associates n, the problem (or dataset) size, with t, the processing time required to solve the problem
