
Friday, May 29, 2009

Visual FoxPro is still here, still very relevant

I am writing this article because I read a very interesting piece about Visual FoxPro on the SD Times web site. The title of the article is ‘Where have you gone, Visual FoxPro?’ Even though the article did some justice to the FoxPro product, its great contributions and its many milestones, I still thought that the basic question it posed had to be answered. This is important to many of us who still use Visual FoxPro to architect and deliver great MIS products, because as far as we are concerned, Visual FoxPro is still very much here!

So what, then, is Visual FoxPro's place in an enterprise application architecture today? Is it relevant when compared to the alternative offerings from Microsoft, the company that makes Visual FoxPro? This article attempts to document our experiences as a company that sells many custom-developed application software products built with virtually all of Microsoft's database offerings (MS SQL Server, MS Access and, of course, MS Visual FoxPro).

I first started using FoxPro in 1992, when it was version 1.01 running on a PC-MOS 4.1 network. I was fresh out of school with a little programming knowledge of GW-BASIC, MF COBOL and dBASE III. The mortgage bank I joined ran its banking software (Swift Banking System), written in Clipper 5, on a FoxPro database. At the time, most people wrote their programs in dBASE IV and then had those programs compiled with FoxPro or with Clipper. I fell in love with FoxPro immediately because it was so easy to use and had a much more polished interface than dBASE IV; what is more, I was very impressed that a package as large as a banking system was running on a FoxPro box and was very stable indeed!

Then in 1993 I left the banking sector and did not use FoxPro again until 2001, when I taught ‘Building solutions with Visual FoxPro 6.0’ to students at MicroLink College (at the time the school had the option of using either Microsoft Access or Visual FoxPro to demonstrate database concepts). It was during this period that I started exploring the idea of developing a powerful, versatile, modern and affordable MIS package for colleges and universities, since the college then had no computerized data storage and retrieval system. That package would eventually become CampusManager™ University Advantage, which is still in commercial distribution and in use at several university colleges in Ethiopia today (running as a pure Fox solution).

The requirement was for a system that was feature-rich, user friendly, provided reasonable storage capacity for the volumes of data that a school or university college could generate, and was still affordable for schools in the third world. The combination of affordability and rich features was necessary because most of the clients that buy our system are small or medium-scale enterprises that will balk at the cost of licensing MS SQL Server or Oracle databases. Also, the programs would be running in an environment where these organizations cannot necessarily afford thousands of dollars to retain trained SQL Server or Oracle DBAs, so the system had to be light and easy to run, yet still provide the storage capacity and power of an enterprise solution built to run on SQL Server or Oracle.

This reality was underscored for us by the fact that most of the Visual Basic programs we wrote to run against MS SQL Server have always drawn complaints from customers (small to medium-scale organizations): managers simply find that purchasing expensive MS SQL Server database licenses is prohibitive. At the same time, Microsoft Access (the Jet database engine) and MSDE did not present an attractive alternative because of the 2 GB file size limit, which would sooner rather than later force a mid-sized company to look for alternative storage. Add to this the fact that in a typical school we would have more than 10 users working at a time, and Access's well-documented behaviour when 10 (ten) or more users access the database concurrently. To summarize, our ‘business parameters’ for selecting an application development tool for our CampusManager™ offering were (and still are) as follows:

* System must be feature-rich
* System must provide reasonable storage capacity for at least 5 years for a school or college with a student population of about 2,000
* System must be capable of supporting at least 20 concurrent users without signs of stress (imagine that at least 15 homeroom teachers have to record marks and enter grading information concurrently during a peak period such as exam time, along with other routine activities by staff and faculty). Our first site, a premium private school, boasted 60 homeroom teachers (Grades 1 to 12, each with five sections A to E)
* System has to be capable of routine performance in just about any environment (the first school we sold to had a P3 server with two processors), since in our market most clients may just have a few PCs and may never even have heard of servers
* System must be light and fast and be easily deployed, configured and implemented in the end-user's environment
* Total cost of ownership for the system must be kept to a minimum. For example, users must be able to administer the database easily (most of our clients may have a teacher or two with a background in simple applications, who would not possess the kind of skills required to manage a typical SQL Server database). FoxPro, in contrast, is easy and fun to learn!
* System must be affordable. If our clients choose to pay not only for our system but also to license the database engine for ongoing database maintenance, that is great for them and for us, but it must not be a requirement

It was against this backdrop that we realized we had a copy of Visual FoxPro bundled with our Visual Studio 6.0 suite of enterprise development tools. After carefully assessing our options, we came to the conclusion that Visual FoxPro 6.0 was the best tool in the Visual Studio suite (we had been concentrating on Visual Basic and Visual C++ against Jet/SQL Server for our other applications) for building a fast, data-centric MIS system for schools, colleges and universities, because it offered just about the right combination of power, speed, storage capacity and licensing cost for the market we wanted to target. For us, this meant that we would not have to ‘fight’ with our clients about expensive database licensing, nor would we have to explain why they needed to move to a more expensive database server after just 2-3 years. At the same time, should the need arise, we can easily upgrade the system to run on a bigger database engine (such as SQL Server, or the Advantage Database Server that now provides full Visual FoxPro compatibility) without expensive code rewrites!

Since our software was released in June of 2006, we have upgraded to Visual FoxPro 9.0 and have found that our product elicits quite a lot of interest because of the attractive cost of the package and the (optional) licensing terms for a VFP database. But whether our clients choose to buy VFP database licenses or not (in which case they can use the VFP runtime), our product is still a money-spinner for Microsoft, since our clients still need to license Microsoft Windows to run the application; what is more, they always license the application with a promise to upgrade to MS SQL Server after a number of years.

A couple of months after SD Times posed the question “Where have you gone, Visual FoxPro?”, we want to supply an emphatic answer: “Visual FoxPro is still here today! Visual FoxPro is still very relevant in the enterprise!” Here in Africa, we are using Visual FoxPro to build modern, powerful, object-oriented and affordable applications for fast-growing companies!

And yes! Even though Microsoft says it will make Visual FoxPro no longer, the product can only get better as many members of the community work on great add-ons and improvements, adding badly needed functionality. With some of the cool tools we have found on CodePlex, such as VFPX and most especially ActiveVFP, we can say that we are in no great hurry to rewrite our application on the .NET Framework anytime soon.

Get Rid of Porn – How to Get Rid of Porn From Your Computer

Get rid of porn today! It is important to get rid of porn from your hard drive, or it can land you in some serious trouble. Sometimes, while you browse the internet, porn fragments and popups install themselves into your registry and it is almost impossible to get rid of this stuff. They keep popping up on your screen and generally have viruses and spyware attached to them. Whether you downloaded the porn yourself or it found its way onto the computer by itself, you need to get rid of porn fast!

The problem with internet porn fragments is that they bury themselves deep inside your hard drive, which makes getting rid of porn an almost impossible task. Once it is on your hard drive you will NOT get rid of it without special porn deletion software. Any porn fragments can be traced back and found on your hard drive no matter how hard you try to get rid of them; this can cost you a pretty penny if you are traced by computer forensics, and it can land you in jail for illegal pornography. This is exactly why you need to get rid of porn now, and Porn Terminator Porn Cleaner works!

I had constant porn popups when I was browsing the internet, and my hard drive and registry were littered with porn, so I had to find a way to get rid of porn from my hard drive. I stumbled upon The Porn Terminator and decided to give it a shot because I had no time to lose and I didn't want to lose my family or my job because my hard drive was littered with porn. I bought the software, downloaded it, and within about 15 minutes I got rid of all the porn that was infesting my computer, so getting rid of porn is possible and it can be done with the right software! I got rid of all the porn on my hard drive and I encourage you to do the same. Why risk everything you love?

An Innovative Database Design Method

This article is intended to give you a single idea that we think is innovative and that you can hopefully use to spur on your design and development tasks and help you build a database system that will meet most if not all of your needs.

The Problems Encountered at the End of a Project

Toward the end of a project, just as you are about to go live with your new database system, is often when things can go wrong, forcing you to delay its introduction.

For this reason, you should think ahead about meeting as many of your company's needs as possible, prepare the groundwork for documenting new requirements as they unfold, and so avoid a troublesome ending.

You can help prevent this from happening by planning ahead and by setting up an innovative method of documenting the design, development and deployment process with a separate, dynamic database model.

The Solution

It all begins by documenting your new database feasibility study and your systems analysis effort and incorporating these documents into your final set of design notes that capture your business requirements, business logic and transactional data into the complete package.

The entire set of design notes could now be inserted into a separate database identified as the “Design Notes Metadata Database” or some similar title. Each table could relate to a major project phase such as requirements gathering, business data and logic collecting, table design and development, test data accumulation, testing, etc.

Each record within every table could relate to a feature of the new database, and each field within that metadata record could describe a component of the feature. The record descriptions would remain intact throughout the entire design, development, test, deployment and operation phases. They would, in addition, remain available for continual update and for reference by new employees just becoming familiar with the application.

The entire process can contribute enormously to clarifying data during development and later when maintenance or troubleshooting needs to be performed, or new features need to be added. It would be a simple matter to query the metadata database and discover necessary information you can use to take the appropriate action.
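As a minimal sketch of the idea (the table name, columns and sample feature below are hypothetical, not prescribed by the method itself), the design notes metadata database can be prototyped with nothing more than Python's built-in sqlite3 module:

import sqlite3

# The "Design Notes Metadata Database": one table per project phase.
con = sqlite3.connect("design_notes_metadata.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS requirements_gathering (
        feature        TEXT PRIMARY KEY,  -- one record per feature of the new database
        business_rule  TEXT,              -- each field describes a component of that feature
        data_source    TEXT,
        status         TEXT,
        last_updated   TEXT
    )""")
con.execute(
    "INSERT OR REPLACE INTO requirements_gathering VALUES (?, ?, ?, ?, ?)",
    ("student_enrolment", "One active enrolment per student per term",
     "Registrar interviews", "approved", "2009-05-01"))
con.commit()

# Later, during maintenance or troubleshooting, the notes are a simple query away.
for row in con.execute("SELECT feature, business_rule, status FROM requirements_gathering"):
    print(row)

The same pattern extends naturally to tables for test data accumulation, testing, deployment and so on, so the notes stay queryable from the first phase of the project to the last.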

Because all the design notes are embedded in a separate metadata database and because this represents a dynamic maintainable database that is updated during all phases of the project – you should not find yourself in trouble at the end of your project.

Better Data Acquisition with More Articulate Database Designs

Environmental data and life science studies are heavily dependent on data acquisition systems, and these in turn are becoming noticeably dependent on well-designed collection systems, best represented by database systems. The more articulate these database systems become, the more accurate and plentiful the knowledge in these areas becomes.

The environmental and life science data collected thus far in history is becoming richer in terms of metadata and mapping strategies for building more dynamic databases. These new inputs help designers craft vocabularies that more closely match the scientific, engineering and technical scenarios needed to build each new generation of models.

Computers long ago outperformed the promises made by database applications, which suggested we would be able to easily collect various real-world signals and waveforms and easily extract critical information for subsequent use as a systematic controlling agent for the dynamic expansion of knowledge acquisition.

But as our data acquisition has grown exponentially through better sensors and more elaborate instrumentation, our knowledge acquisition has not. That is, until maybe, now. The heart of the solution might be the database system concepts slowly emerging and their more articulate reach into semantic scenario measurement systems.

Sensors working in conjunction with instruments, all tied to a data logger, provide the measurement configuration. Data loggers are electronic devices that collect data over time or across a specific area.

The flat data files written by the data loggers house the raw sensor signals. Their readings are periodically transferred to simple but more secure databases, which in turn feed a knowledge database management system as more data flows into relational and hierarchical databases.
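As a rough illustration of that hand-off (the file name, column layout and units are invented for this sketch, not taken from any particular logger), a small script can periodically sweep a logger's flat file into a relational store:

import csv
import sqlite3

# Hypothetical flat file written by a data logger: timestamp, sensor id, reading.
with open("logger_dump.csv", "w", newline="") as fh:
    fh.write("2009-05-29T10:00:00,temp_01,21.4\n2009-05-29T10:05:00,temp_01,21.9\n")

con = sqlite3.connect("field_station.db")
con.execute("CREATE TABLE IF NOT EXISTS raw_readings (logged_at TEXT, sensor_id TEXT, value REAL)")

# Sweep the flat file into the relational store that the knowledge base later draws on.
with open("logger_dump.csv", newline="") as fh:
    rows = [(ts, sensor, float(val)) for ts, sensor, val in csv.reader(fh)]
con.executemany("INSERT INTO raw_readings VALUES (?, ?, ?)", rows)
con.commit()
print(con.execute("SELECT COUNT(*) FROM raw_readings").fetchone()[0], "readings stored")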

The knowledge base grows as the feeds into relational and hierarchical database management systems grow, providing level-tag assignments and assisting in the eventual manipulation that will be necessary for successfully extracting meaningful real-world data and developing insight into the environment.

Through a human interface, and because the databases mentioned above are so well mapped with predetermined location and expectation tags and strongly aided by relevant metadata, a system of data-highlighted reports is produced that helps suggest different kinds of conclusions, both scientific and technical, for practical engineering purposes.

This process continues to evolve into more intricate patterns as database designs become more articulate in their abilities to store, compare, manipulate and report data.

Database Management System Types and their Characteristics

Database management has become an important part of every company that has data to be managed and handled. Server databases and desktop databases are the two main types of database management systems. The desktop type is aimed at single-user applications and runs on standard personal computers.

The server database, by contrast, is mainly aimed at multi-user applications and offers greater reliability and data consistency. It is costlier than a desktop database and operates on high-performance servers.

No organization should blindly jump to a conclusion about database selection; proper pre-analysis and research are necessary. Sometimes you decide to buy an expensive server database, but once you calculate your business requirements a desktop database turns out to be the proper fit; at other times a genuine requirement for a server-based database arises.

In order to analyze which database would best suit the company's needs, certain points have to be made clear: how often will your company's data need to change, and who will make these changes? The people who will be in charge of using the database, and the work they will perform, should be known. The person responsible for maintaining the data, and the source that will provide IT support, should also be identified. Finally, check which hardware is available and what the budget for purchasing hardware is.

After clearing up these important issues you can start evaluating specific database management systems. To fulfil complex database needs, you may require a sophisticated multi-user server platform such as Oracle or SQL Server. For simpler needs a desktop database such as Microsoft Access is the right choice: it is inexpensive, offers simple data storage and manipulation facilities, and, as the name suggests, operates on personal computers and is best suited to them. Apart from Microsoft Access, Lotus, FileMaker Pro, FoxPro and Paradox are some popular desktop database products.

Oracle, Microsoft SQL Server and IBM DB2 are some popular server databases. They provide the muscle to manage large amounts of data effectively, and users can access these data whenever required. Companies that can afford this type of database get the benefit of detailed data management design and solutions.

Thus a well-structured and well-planned database design provides a strong base for future company success, and you can plan accordingly for better performance and future growth.

Reverse Engineering MySQL Database Driven Applications on Windows

Introduction

You've just started a new job, you need to hit the ground running and quickly, but the rest of the team are “too busy” with their work and you don't want to keep pestering your new colleagues or boss with questions. Enter the MySQL Binary Log.

If you have never heard of the Binary Log I suggest at least finding out what it is and what it is usually used for before you continue with the tutorial.
Tools of the trade

O.K. Let's start with the tools required for the job. You will need to download a neat little toolkit called unxutils from SourceForge and make sure its install path is referenced in your Path. Unxutils is a handy toolkit which brings many Unix commands to the Windows desktop; it is especially useful if you commonly switch between Windows and Linux platforms. Once you have done this, open a Command Prompt window and test that you can access the commands. Type the following command and hit enter.

ls -l

If you get a directory listing (the above command was simply a test that unxutils is installed correctly), you can happily move on to the next step: enabling the binary log on Windows.
The Service

If you search Google for "Enabling MySQL Binary Log on Windows" you will get a stack of results, some of which will lead you to MySQL bug reports. The reason for this is that most Windows users will have installed MySQL as a Service using the Windows installer, and by default binary logging is not enabled. If you follow the instructions below to the letter, you will be one of the few who has binary logging enabled on a Windows box.

For this example I am going to install another MySQL service alongside my original one, as I do not wish to use binary logging all the time. If you already have a MySQL service running, make sure you stop it at this point. If you don't still have the command line open from the first task, fire it up and type the following command.

sc create MySQLBinLogging binPath= "\"C:\Path\To\MySQL Server\bin\mysqld\" --log-bin=yourmachinename MySQL"

After you have run this command you should see a success message, at which point you can open the Services manager and start the service. On a typical MySQL installation the binary log file will be located in the MySQL data directory; on Vista this is typically under "C:\ProgramData" (for example "C:\ProgramData\MySQL Server\data"), and the log will be named as you specified with the --log-bin directive when installing the service. The log file is rotated on a size basis, so the first log file should have a suffix of 000001, and so on. At this point we are ready to extract the juicy data we have been waiting to get at.
The Data

Again we will be operating from the command line, so, assuming you still have it open, change to the data directory where the log file lives, type the following command and hit enter.

tail -f mylogfile

And that is it. If you want to output the data into a text file for further analysis just type the following.

tail -f mylogfile > myoutputfile.txt

NOTE: The binary log only stores statements that modify data, so you won't find your SELECT queries here. You will need to work with the General Query Log for those.

XML Databases Dynamically Redefine Meta-Databases

More and more implementations of XML-related software are beginning to prove that the tree-like structures of XML databases may be the most useful working models of meta-databases, with web service connections exposed through web applications.

As you know, metadata is essential for understanding information stored in relational databases through schemas and other descriptors, and it has become increasingly important in supporting the growing notion that relational databases are falling behind when it comes to wireless networks, sensor arrays and the numerous other kinds of complex networks associated with the corporate world of data trafficking.

Since metadata can describe how, when and by whom a particular set of data was collected, and how the data is formatted, it fits nicely into a tree-like structure with attribute tags serving as data-about-data.
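A small, purely illustrative fragment shows the idea; the element and attribute names below are made up, but they carry exactly the who, when and how described above, and Python's standard xml.etree.ElementTree module can read them back as data-about-data:

import xml.etree.ElementTree as ET

# Hypothetical dataset description: the attributes are the metadata.
doc = """
<dataset collected_by="field_team_3" collected_on="2009-04-17" format="csv">
    <sensor id="temp_01" unit="celsius"/>
    <sensor id="ph_02" unit="pH"/>
</dataset>
"""

root = ET.fromstring(doc)
print("collected by", root.get("collected_by"), "on", root.get("collected_on"))
for sensor in root.findall("sensor"):
    print(sensor.get("id"), "measured in", sensor.get("unit"))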

For many years, databases were implemented as relational structures that kept records of kinds of data that often lacked the appropriate pre-design and semantic relationships necessary for a normative vocabulary. This approach did not work well with different partially structured data types. But since many other kinds of data were already becoming well established within the specifications of relational implementations, the concept of a regular corporate database became synonymous with the relational model.

However, this started to change as the group of people who kept databases became more diverse. The more database designers began to explore the limits of the medium and of the electronic technology that made it possible, the more the boundaries of what could be called a "database" expanded.

Today, mobile data-streaming devices may well change the definition of meta-databases entirely by making it possible for database designers to create new kinds of postings through growing web services connections. XML databases give designers new flexibility in tying multiple kinds of different platforms together to work as one system.

Another element of the XML database class of software that is starting to redefine meta-databases is the corporate database. The XML software has opened up a wide variety of web services that are drastically rearranging the IT departments within corporations.

As more companies experience the benefits of using XML datasets as metadata models for controlling relational structures with XML software, the use of meta-databases will become more dynamic. And as all of these different forces continue to expand and reshape the XML database sphere of software, we will see more new semantic tag systems reaching through the self-defining nature of XML.

Database-Assisted Marketing Management

Database-assisted marketing management involves choosing target markets that not only bring in new customers but also retain the existing ones. It is a business discipline based on research into, and the practical application of, marketing techniques and the management of marketing resources.

Anyone who practices techniques associated with this field could be referred to as a marketing manager with additional skills stemming from the ability to query and analyze database reports that improve marketing vision. The role of this kind of marketing manager is to use this analysis to influence the timing and level of customer demand so as to help the sales process.

This role of marketing manager is actually a virtual function and the ultimate naming convention depends on the size of the business and the type of product or service environment we are focusing on. If the work takes place in a huge production company there could be many general managers each assigned a particular product or product category. In this case, the manager would be responsible for database reporting analysis of the product or category and its profit and loss.

In a smaller business environment the marketing manager with database query and reporting credentials of a particular product or service may very well be the owner or partners of the company.

Creating and communicating the best customer values can increase the number of customers. The steps taken and resources used to maintain existing customers and win new ones fall under database-assisted marketing management. The scope is quite large because it consists not only of developing a product but also of sustaining it. The additional burden includes the constant renewal of proven sales and marketing concepts, coupled with the database querying and data analysis necessary to sustain a growing business.

The term database-assisted marketing management has many definitions. The concept really depends on the individual firm and on how the marketing department functions alongside other departments such as operations, finance, pricing and sales. But our concern here is mainly with obtaining accurate sales and marketing data; and when we depend on a database system to deliver this data to us, we naturally turn to database verification, calibration and validation of its input and output processes.

Before deciding on a marketing strategy, the company must do an in-depth study of its business and the market. This is where database-assisted marketing management merges with strategic planning. Usually the marketing analyses are of three types: customer analysis, company analysis and competitor analysis. Using customer analysis, the market is broken down into different types of customers.

The database-assisted marketing manager identifies the characteristics and other variables of each group, such as geographical location, demographics, customer behaviour patterns and needs. Groups might be recognized who are less price-sensitive and whose purchases are growing; such groups are worth heavy investment of money and time. The firm can not only retain such customers and win new customers in this group, but may even go as far as turning away customers who do not belong to it.
Understanding these needs allows customer expectations to be met to their satisfaction, better than the competitors can, which will lead to higher sales and obvious profit.
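As a loose sketch of the kind of query such a manager might run (the table, columns and figures are hypothetical and built in memory purely for the example), segmentation by location and buying behaviour is little more than a GROUP BY:

import sqlite3

# Hypothetical customer table; in practice this would be the firm's own CRM data.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (region TEXT, purchase_frequency TEXT, annual_spend REAL)")
con.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    ("north", "monthly", 1200.0),
    ("north", "yearly", 150.0),
    ("south", "monthly", 900.0),
])

# Break the market down into groups and rank them by average spend.
for row in con.execute("""
    SELECT region, purchase_frequency, COUNT(*) AS customers, AVG(annual_spend) AS avg_spend
    FROM customers
    GROUP BY region, purchase_frequency
    ORDER BY avg_spend DESC"""):
    print(row)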

Company analysis highlights the cost structure and resources of the company and its cost position when compared to competitors. Accounting executives use it to learn about the profit earned by a particular product. From time to time, audits are conducted to study the strengths of the company's various brands. Here again, the functions derived from software database validation audits strengthen the reliability of the data.

Marketers using competitor analysis build detailed customer profiles using validated database queries and reporting techniques. This gives a clear picture of the strengths and weaknesses of the firm when compared to a competitor. The competitor's cost structure, resources, competitive positioning, degree of vertical integration, product differentiation and profits are studied in detail and compared with what each company is doing in those regards.

Database-assisted marketing analysis introduces additional query and reporting phases that help carry out the marketing research. The most common types of research are qualitative marketing research, quantitative marketing research, experimental techniques and observational techniques. All of these research phases are enhanced by well-known database validation processes approved by the company.

Because of all the studies and research conducted through a software-validated database, it is easier for the marketing manager to make strategic decisions; they can then design a marketing strategy to increase the profits and revenues of their company. This approach of database assistance extends into profits over the long run, an increase in market share and revenue growth.

MS Access – An underutilised and underrated business tool

Lots of people own it, some tinker with it, a few are experts in it and most IT departments hate it.

Microsoft Access has been a part of the Office family for many years and has gradually evolved along with its siblings: Word, Excel, PowerPoint and the like. In many office environments people use it as a glorified spreadsheet package, enabling far simpler solutions than Excel's "vlookup"-type scenarios and catering for much larger sets of data, but this is very much where the problem lies. Ask anyone in your IT department and they will say it's rubbish as a database. It's not rubbish; it is just that IT departments cannot support an application that has not been developed by themselves, and more often than not has not been developed by a professional using standard and accepted conventions.

So you end up with a half baked product, developed by someone internally who has “a flair for these things”, the IT department won’t get involved and the originator has long gone. Little wonder MS Access gets the press that it does!

The truth is, developed properly, applications in MS Access can be powerful and extremely cost effective. Okay, if you have more than 20 users and are processing millions of records it won't be powerful enough for your needs, but most companies aren't processing those volumes and don't have that many employees, let alone users.

Let’s look at the positives;

1. There are absolutely no licenses required (apart from the license you get when you purchase MS Office).
2. Development costs are a fraction of those of client-server based applications like SQL Server or Oracle (this is not just down to the programmers' wages; when you commission a system on these platforms you also end up employing business analysts, project leaders, accountants and all sorts of other people making up the "development team").
3. Development time is very quick by comparison.
4. It is extremely flexible and can be altered or added to without fuss.
5. Because it is part of MS Office it interacts seamlessly with its siblings.
6. It can easily be upgraded to SQL Server if your volumes do start to overwhelm it.

On the not so positive side, it is not designed for online use. You can set up forms that allow access to the application through a web page, but the database engine really isn't powerful enough to cope in this scenario.

So, if you need online functionality, MS Access is probably not for you. But if you have any type of administrative functions that are repetitive and time consuming, a well designed Access application will save you a fortune in time and money.

Employ an established developer to design and build your application and you'll see that Access can automate a multitude of business processes. Make sure they offer good after-sales and support services, and I promise you will not look back.

Smart card security

No security in this world is perfect. Security is always about the cost of implementing and enforcing the security versus the cost to the fraudster of breaking it and the cost to be paid for a security breach. Depending on the type of system, that cost may not be limited to money; it can include reputational damage, lives lost and even national security. Certainly, if human lives and national security are concerned, then the system cost may not be the most important consideration. However, for many commercial smart card applications, the technology must bring real benefits to all parties in the application: the operator, the cardholders and the other service providers. It must help them make more money, lower costs and/or deliver benefits. Last but not least, smart card technology must be the most cost-effective technology for the job.
Security in smart card applications
In any smart card application, a basic requirement is that it must be secure. The axiom for smart card application security is that the cost of enforcing the security must be much smaller than the effort needed to break it, and the cost of breaking the security must be higher than the potential rewards obtained by breaking it. In an efficiently designed smart card application the cost of the system implementation is minimized but the security is not compromised.

A smart card application is a distributed system. It comprises a number of subsystems, each performing part of the system function. The security of the system consists of front-end enforcement, front-end and back-end verification, back-end audit and system fraud damage control. The system design always assumes a fraud scenario and considers how it can be controlled. Security must be addressed at the system level, not focused on just one small subsystem or component. It can still make sense to use a lower-cost card that is seemingly less secure; that does not mean that security will be compromised, because security can be implemented in other subsystems so that the application remains secure while the overall system cost is lowered.

To illustrate the argument, take a real application example: using the I2C free-access memory card (e.g. the 24C02) as a prepaid electricity meter card. This card can be freely read and updated and is therefore easily cloned. However, a cloned card will not be accepted by the system. This is done as follows. The prepaid electricity meter is installed in the house and accepts only one, or a limited number, of prepaid cards based on the 24C02 free-access memory card. A prepaid card is only acceptable to the meter once it has been authorized by the authority's staff card, whose authorization system ensures that the card is not registered with another meter. The staff card is inserted into the meter, followed by the prepaid card, so that the card is registered with the meter. The card contains a serial number, a transaction counter and a stored value. These data are MAC-ed so that they cannot be tampered with without detection. During a top-up, the meter first verifies that the card's serial number is registered in the meter, that the transaction counter is one higher than in the previous transaction, and that the MAC shows the data has not been tampered with. Then the transaction counter inside the meter is incremented and the value is transferred to the meter. The fraudster will not know the MAC for the next transaction counter, since he does not have the MAC key; he has to pay for the top-up before he ever sees the corresponding MAC. He cannot tamper with the stored value, as that would be detected, and a cloned card cannot be used, because the meter expects the counter on the card to have been incremented. This illustrates that even with a free-access memory card, security can be achieved if the system security design is done correctly.
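To make the mechanism concrete, here is a minimal sketch of the meter-side check described above; the field layout and the use of HMAC-SHA-256 are assumptions made for illustration, not the scheme of any actual meter:

import hmac
import hashlib

MAC_KEY = b"issuer-secret-never-stored-on-the-card"  # held by the authority/meter, not the fraudster

def mac_of(serial: str, counter: int, value_cents: int) -> bytes:
    payload = f"{serial}|{counter}|{value_cents}".encode()
    return hmac.new(MAC_KEY, payload, hashlib.sha256).digest()

def accept_topup(meter_state: dict, card: dict) -> bool:
    """Meter-side verification of a prepaid card record."""
    if card["serial"] not in meter_state["registered_serials"]:
        return False                                  # card was never registered with this meter
    if card["counter"] != meter_state["last_counter"] + 1:
        return False                                  # cloned or replayed card: counter not one ahead
    expected = mac_of(card["serial"], card["counter"], card["value_cents"])
    if not hmac.compare_digest(expected, card["mac"]):
        return False                                  # stored value or counter has been tampered with
    meter_state["last_counter"] = card["counter"]     # commit: this top-up can never be replayed
    return True

meter = {"registered_serials": {"SN-0001"}, "last_counter": 0}
card = {"serial": "SN-0001", "counter": 1, "value_cents": 5000,
        "mac": mac_of("SN-0001", 1, 5000)}
print(accept_topup(meter, card))  # True: a genuine, freshly purchased top-up
print(accept_topup(meter, card))  # False: replaying the same card data is rejected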
Is smart card really secure?
There has been some publicity in the smart card industry about how the Mifare card can be cloned. People without a good understanding of smart card application security design tend to focus only on the smart card and its algorithm. The cryptographic algorithm in a memory smart card such as Mifare certainly cannot be compared to that of a CPU smart card. A competent smart card application designer using this type of card must implement other security measures in the card mapping design and in other subsystems to strengthen the application security. On the other hand, an incompetent designer, even if he uses the most secure card available, can still end up with an insecure system.

Is All Your Data Slowing You Down?

Some of you have been using the same accounting system for a while. The good news is that you have years of historical data to draw upon for everything from measuring your company's performance to negotiating vendor contracts to finding the product that Bob purchased five years ago. The bad news is that you have to wade through years of transactions to get to the current data. A lot of data can slow down not only your searches for information but overall system performance.

Wouldn't it be great if you could have the best of both worlds?

You can.

Consider this: Other than trending reports and the rare ad hoc lookups, you use only the current fiscal year and the prior year or two for most of your day to day functions.

An option for handling this situation is to make a copy of the current database for your company and use the copy as an "archive" company.

* Since this "archive" company will be a real accounting system company, it can be accessed as needed. You will simply log into this "archive" company and look for the data you need.
* The security for this company can be set as read-only so users will not be able to change any data.
* Software tools such as FRx, SQL Server Reporting Services, and Crystal Reports can make reports that combine the data for seamless reporting.

Once you have all your data archived, you are free to remove history from your current company using the accounting system's transaction history removal tools.
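A rough sketch of the archival step on SQL Server might look like the following; the database names, file paths and connection string are placeholders, and in practice your consultants or DBA would adapt and test this before touching production data:

import pyodbc  # assumes the pyodbc package and a SQL Server ODBC driver are installed

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=master;Trusted_Connection=yes",
    autocommit=True,  # BACKUP/RESTORE cannot run inside a user transaction
)
cur = conn.cursor()

# 1. Back up the live company database.
cur.execute("BACKUP DATABASE CompanyLive TO DISK = 'C:\\Backups\\CompanyLive.bak'")
while cur.nextset():
    pass  # consume the informational messages so the backup runs to completion

# 2. Restore the backup under a new name to create the read-only "archive" company.
cur.execute("""
    RESTORE DATABASE CompanyArchive
    FROM DISK = 'C:\\Backups\\CompanyLive.bak'
    WITH MOVE 'CompanyLive' TO 'C:\\Data\\CompanyArchive.mdf',
         MOVE 'CompanyLive_log' TO 'C:\\Data\\CompanyArchive_log.ldf'""")
while cur.nextset():
    pass

# 3. Make the archive read-only so nobody can change historical data.
cur.execute("ALTER DATABASE CompanyArchive SET READ_ONLY")

The logical file names used in the MOVE clauses are also assumptions; they must match whatever sp_helpfile reports for your live database.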

Next Steps

Creating an "archive" company requires significant accounting system and SQL Server knowledge and experience. Omnios professional consultants can assist you with planning the archival and implementing the change.

Giving Visual FoxPro a new lease of life

I read an article on the Internet about what can be done by both Microsoft Corporation and Visual FoxPro programmers to give Visual FoxPro a new lease of life. For example, that article suggested doing away with the Fox icon at the top of the Visual FoxPro application window, re-engineering parts of the FoxPro application, including doing away with the Windows 95 dialog boxes, and so on. This article seeks to contribute to the debate by suggesting additional ideas that the author considers critical not only to reviving VFP but also to attracting new developers to the VFP cause. The author believes that a certification program (the VFP track has now been discontinued by Microsoft) along with a ‘Built with VFP’ standard logo are critical to putting VFP back on the map.

Introduction

It is no longer news that Microsoft has announced that there will be no VFP 10. Microsoft has since announced that, aside from support through 2014, there will be no active marketing of VFP, and Microsoft's actions speak louder than words. In the interim, many new products and white papers have surfaced advising VFP programmers on how to make the transition to the .NET Framework, as well as products that promise to let you code your application in VFP while compiling to ‘IL’, the intermediate language that enables .NET programs to run.

Many Visual FoxPro programmers have asked themselves what this means and what it will entail. I believe that events have already answered this question. Even before Microsoft announced its intention to stop making VFP, capable members of the VFP community had already started building enhancements to VFP with the primary goal of making it competitive with other contemporary programming languages. These efforts crystallized as VFPX and VFPY on CodePlex, eventually prompting Microsoft to publish its own efforts (Sedna) on CodePlex as well.

Therefore it was with great interest that I read the many opinions and ideas that VFP Developers have on how to improve VFP. While I will not dispute these ideas, I just want to add to them!

I believe that Microsoft's strategy to ‘kill’ VFP is simple: if VFP programmers are given an ultimatum to change to .NET, and marketing, support and further development or enhancement of the VFP product line are discontinued, then VFP programmers will have no choice but to change over to VB or C# (i.e. the .NET Framework). But as programmers we must ask critical questions. What are the critical performance benefits of .NET Framework applications as opposed to non-.NET Framework applications? Since the .NET Framework was touted as a platform-independent solution, perhaps comparable to Sun Java's byte code, on which additional OSs or platforms apart from Windows has the .NET Framework been running since its inception?

These questions help VFP programmers realize that there is nothing they are missing on the .NET Framework. (Don't get me wrong: the .NET Framework is a great effort by Microsoft with some truly great features and promise!) This means that, since Microsoft has agreed to at least open-source the VFP environment, it will be up to the VFP community to keep the flame alive, through advertisement. What form might such advertisement take?

What can be done?

One form would be the publication of new case studies for Visual FoxPro based projects; the Web provides a ready medium for this purpose. Another would be to produce a new certification exam for Visual FoxPro programmers to replace the certification exams now dropped by Microsoft Corporation for the VFP product line. To understand why this is important, consider that if you were delivering an Oracle product, it would do you good to be an OCP or an Oracle DBA. Similarly, Microsoft retains certification for its other products apart from VFP. The VFP community could therefore set up new certification exams (considering also that in the developing world this may be the only qualification available to VFP programmers who are non-degree holders) that VFP programmers could take. The aim of the certification exam would be to ascertain that a programmer is qualified to undertake and deliver solutions built with VFP as the primary development language.

Such certification exams could be structured to deliver credits earned in three levels viz: Certified Visual FoxPro Professional (CVFP), Certified Visual FoxPro Master (CVPM) and Certified Visual FoxPro Enterprise Architect (CVFPA).

The aim of each certification could of course be decided at a later stage, but, for example, the Certified Visual FoxPro Professional examination could aim to ensure that a Visual FoxPro programmer can write workable VFP-based desktop applications. The CVPM certification could certify that a VFP programmer can build full-featured, pure-Fox, two-tier client/server applications using VFP as the primary application development environment. The CVFPA certification would test a programmer's knowledge of how to use VFP to build top-notch n-tier distributed web-based or Windows-based enterprise applications.

These certifications would be designed to give VFP programmers something to hold onto and also, if properly publicized, to attract new programmers into the VFP fold. The trend we have observed in our part of the world is the importance placed on professional certifications of this type as testimony that the professional does indeed possess the requisite skills to deliver a solution based on the stated technology. We have also observed that most people look upon the availability of certification exams as testimony that the technology is current. Why should this be different in the case of Visual FoxPro? While an MCSD on the Visual FoxPro track existed, Microsoft ensured that those for Visual Basic and Visual C++ overshadowed it. Now VFP programmers can set up something of their own and ensure that it is properly marketed to developers!

Of course, setting up certification exams would have to come with the full works, such as setting up curricula, courseware and approved training materials that students could use. Such materials could also form the nucleus of another marketing ploy: marketing VFP as the principal tool for teaching database concepts at educational institutions, and making sure people know that VFP educational licenses are available and very affordable.

Yet another certification, and therefore marketing, ploy would be to certify VFP product sites as ‘Powered by VFP’ or ‘VFP Enterprise Approved’ or whatever you would want to call it. This would allow organizations that buy solutions built with VFP to be assured that they have received an enterprise solution engineered to the highest standards, much as Lucent used to certify network sites to their ‘Systimax’ structured cabling standard in those days. Of course, anyone who has been around would know that other cabling standards such as ‘Belkin Structured Cabling’ existed, but who was setting the pace and the standards, and why?

Who would administer such a program?

Efforts such as VFPX and VFPY have already demonstrated the power that community will can bring to bear. The same spirit can be brought to bear on the ideas contained herein. Perhaps I will make bold to say that the VFPX team can be used as a rallying point to nominate eminent members of the VFP community, with a sound knowledge of the VFP product technology line, to administer a VFP University product foundation that could then oversee or implement the ideas contained herein, taking care of course to structure the organization in such a way that it does not violate Microsoft's trademarks or copyrights, while giving a new lease of life to existing VFP programmers and attracting new talent to take up building solutions with VFP!

Since some of the programmers required to administer this idea would be working professionals earning a living, the whole thing could still be structured around the idea of VFPX, with a core of professionals coordinating everything and making sure that certifications are properly gazetted with both Microsoft and other professional bodies, while allowing VFP community members to contribute their ideas and thoughts on the curricula and other aspects of the organization to ensure that all certifications actually reflect training on current industry-standard practice.

The VFP University Product Foundation team could:

1. Draw up curricula for VFP certification examinations and promote those examinations world-wide through the usual VFP community sites and other amenable technology sites
2. Publish training materials that could be used for such training and certification study programs
3. Ensure that the certifications issued are properly accredited with both Microsoft Corporation and other industry standards bodies (whoever takes this on should know that it is worthwhile)
4. Appoint approved training centers world-wide that could offer such training
5. Provide a means for such training to be taken online (much as with other CBT offerings online)
6. Certify and rate project sites implemented with Visual FoxPro to ensure that they meet industry-standard practice and, upon certification, issue a ‘Powered by VFP’ logo/seal as a mark of quality
7. Set and publish standards for VFP programmers
8. Provide a forum for the publication of new case studies by VFP programmers and ensure that these are also published in other technology forums (not just the usual VFP ones)
9. Promote VFP to educational institutions and foundations for use as a primary tool in demonstrating database concepts and teaching programming concepts
10. etc.

The VFP University foundation would be just like Oracle University or Microsoft University, providing a critical lifeline of continuing education on the VFP product line and doing for VFP what Microsoft is no longer doing, just as Microsoft University does for other Microsoft products and Oracle does for the Oracle product portfolio!

It takes Money! Where will the Money Come from?

Definitely, administering an initiative of this sort takes money and financial resources to set up web sites, produce and publish training materials, place adverts, print and send certificates, travel, etc., but this need not be an obstacle! If programmers pay a fee to take the exam, and are required to take a new exam every two to three years as new features are integrated into the VFP product by voluntary community efforts, funds will be available. Additionally, certifying sites, speaking at community events and so on could be done for a token fee that would bring in resources not only to run the foundation but also to provide token compensation for the core of people coordinating its activities. A seed fund would definitely need to be raised by voluntary donations to kick-start the whole effort.

What it all boils down to!

All of the ideas itemized above will not, of course, obviate the need for improvements and modernization of the VFP product, nor do we as a community need to take Microsoft's place in advertising a product that it makes. This means that as long as the product remains largely closed-source, many of these things will depend not on the community but on Microsoft: for example, replacing those Windows 95 dialogs with new ones, and so on. However, because we are the ones who use this product, and some of us have too large an investment in it to start rewriting in another language, we have to take some action to ensure that the VFP product line is not just casually swept under the carpet. We do not have to accept a fait accompli!

This means that the campaign to get Microsoft to honor its ‘corporate-community’ responsibility to the many VFP programmers who make a living by writing software with VFP, and to open-source all of VFP (not just its environment), must be intensified. Now is as good a time as any for the VFP community to take a definite stance on their beloved, venerable development tool of choice!

VFP has a reputation for a vibrant, active, dedicated and loyal user community, something that is the envy of most other products, including Microsoft's Visual Basic and C#. Now is the time for this community to show its true mettle, taking the VFP bull by the horns and doing what must be done to keep the product alive. It will require will on our part as a community, and we need not always be the underdog. ‘Yes we can!’

Thursday, May 28, 2009

MS Access versus Client Server Database Platforms

Throughout my years working with Access, initially using it as a glorified spreadsheet package, I have gradually honed my skills, but I am still learning new things every day. Every new project takes my knowledge to a new level, and I want the business world to realise just how versatile this product really is.

Having built systems as complex as reinsurance pool ledgers, I know just how powerful it can be. It is a shame that the IT fraternity have always mocked its capabilities as a reliable database application. This is mainly to do with the fact that users can build their own systems, which generally don't follow standard programming conventions and so become, from IT's point of view, impossible to support. However, designed and programmed properly, this needn't be the case.

OK so you can’t build conglomerate size systems with Access, I accept that. But there are literally millions of companies in the world that aren’t conglomerates.

I have worked for Insurance companies that have invested millions of pounds in their systems which is obviously a completely different ball game, but for the small to medium sized business MS Access is invaluable.

Access can produce an extremely professional, user-friendly interface or menu-based functionality that easily rivals the bigger fish. Indeed, Access interfaces are often used to front client-server systems such as SQL Server because of Access's flexibility and ease of menu design; the only difference is that the data is held on the SQL Server platform instead of inside Access itself. Obviously the volume of data is what counts here, but let me tell you, I have a database with over three million records in it and it works fine. It has only taken up about three-quarters of the 2 GB file size that Access can handle, so as you can see, for most companies this is more than sufficient.

Let's talk about cost. For the price of developing a system in Access, you would probably have spent the same just getting the system spec'd out in Oracle. This is because the companies that develop these client-server and mainframe systems have huge infrastructures and, as a consequence, huge overheads. They will charge a minimum of £100 per hour for every individual at every meeting they have with you; then there is the project manager, the accountant, the programmers and so on.

You then need powerful hardware to handle such a system, along with expensive licenses for every user. Access needs none of this! Any reasonable server can handle it or, if you are a small outfit, a desktop computer is fine. The only license you will ever need is the license you got when you purchased MS Access itself (most people already have it as it comes with MS Office).

So you are just left with the cost of developing it. Well most Access programmers either own or work for small establishments and simply don’t have the same massive overheads.

If you need a bespoke system, please consider the merits of using Access, it may not be for you but then again it just might. Find yourself an established company, with good support mechanisms, and you might just save yourself a fortune!

Save time and money with Dial-A-Tech

Most people of the fresh age regard time as equal to so. So, no one would endure their computer being idle on account of various tribulations in the hardware or software. Computers are definitely a vital element of our life, in the modern day along with age because the achievement of our enterprise depends upon them. Along with, a little stoppage in our computer can cause an intense failure. However, these failures could be prevented with correct upholding as well as defensive measures. Even so, now you can easily avail onsite
be augmented, due to the apposite computer support Sydney facilities from proficient technicians. The onsite computer repair Sydney facilities would cover all the central facades of your computer. The deletion and prevention of the virus in addition to spyware would be done by the technician along with the various new putting in along with data backup.
Now, it has become very effortless to avail the advantages of the online computer repairs Sydney. If there is any fault in your computer, it stops working, or does not operate appropriately, and then you can easily contact an organization that donates the onsite computer repairs services. Dial-A-Tech is the best alternative for you, if you are looking for a company that supplies computer support Sydney. They are total professionals in solving all the problems associated with the computer. They bestow their amenities in all components of Sydney. The utter professionals operating at Dial-A-Tech exert all days, round the year. In the specialty of onsite computer repairs Sydney, Dial-A-Tech is a most unfailing name. You can avail all classes of computer support Sydney for your computer that is employed at your dwelling as well as your office. Those computers that are impinged on due to spyware, adware and virus, cold be effortlessly treated by the professionals of Dial-A-Tech. This would augment the operation of your computer and it would work faster than before. Dial-A-Tech also donates services like laptop repairs Sydney, data recovery Sydney, computer support Sydney, remote backing and search engine optimization , excluding the mobile computer repairs Sydney services.
All your computer troubles should come to an end after contacting Dial-A-Tech for onsite computer repairs, and having the work done on site saves you the time and cost of transporting the computer to a repair shop. Simply call Dial-A-Tech and they will send an experienced technician to sort out the problems with your computer, and they charge their clients very reasonable rates for all of their services.

Tuesday, January 20, 2009

Recovering Your PC From Disaster With Vista Backup & Restore

Whether you're a business or an individual, it's more important than ever to be able to recover from disaster as quickly as possible and with minimum data loss. However, it's surprising just how many people do so little about it. This article outlines this useful feature of Windows Vista and how to get the best out of it, so you can feel secure in the knowledge that recovering from disaster is simplicity itself.
A prerequisite to using this feature of Vista is a license for either the Business or Ultimate edition; unfortunately, Microsoft have not made it available in the other versions of Vista. There are two types of backup you can use, and they affect how you restore the system:
1. Using 'Backup Computer' you can write what is called an image backup of your entire PC to a backup device, usually a USB stick or external drive. This snapshot of your entire system allows you to recover to exactly the same point in time. However, because it is an 'image' of your system drive, it requires the restoring PC to have a system drive of at least the same capacity; usually it will be the same PC, so this is not a problem. It is also the lengthier of the two backup options, as it backs up the whole machine.
2. Using 'Backup Files' you can back up selected data files such as images, photos, music, documents, spreadsheets, emails and application data. This type of backup is incremental, i.e. it only backs up changed files, and is therefore usually very fast, taking only a few minutes. However, it will not let you restore a full system, only the backed-up files (a toy sketch of the incremental idea follows below).
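To make the incremental idea concrete, here is a toy sketch in Python (the folder paths are hypothetical, and this only illustrates the principle, not how Vista's own file backup works internally): copy only the files whose modification time is newer than the previous run.

import os
import shutil
import time

SOURCE = r"C:\Users\Me\Documents"          # hypothetical folder to protect
DEST   = r"E:\FileBackup"                  # hypothetical backup destination
STAMP  = os.path.join(DEST, "last_run.txt")

os.makedirs(DEST, exist_ok=True)
# Time of the previous run; 0.0 means "back up everything".
last_run = float(open(STAMP).read()) if os.path.exists(STAMP) else 0.0

for root, _dirs, files in os.walk(SOURCE):
    for name in files:
        src = os.path.join(root, name)
        if os.path.getmtime(src) > last_run:               # changed since the last backup?
            dst = os.path.join(DEST, os.path.relpath(src, SOURCE))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)                         # copy contents and timestamps

with open(STAMP, "w") as f:                                # record when this run happened
    f.write(str(time.time()))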

Typically we advise you to use both types of backup and blend them to cover all your data and your system. So, for example, you might take a complete image backup weekly but a file backup daily. In the event of a disaster you would need to restore the system backup first, and then each daily backup taken after it (since the file backups are incremental, they contain changes that will not be in the full system backup).
The service has a scheduling function built in, so regular file or full image backups can be taken on whatever basis suits your needs, be that daily, weekly or monthly. With the speed of modern drives, a backup of your entire PC can be done within an hour and, in the event of a disaster, recovered in a similar amount of time.
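As a sketch of how the weekly-full-plus-daily-files strategy plays out at restore time, here is a small Python illustration with hypothetical dates: you restore the most recent full image first, then every file backup taken after it.

import datetime

# Hypothetical backup history: (date taken, kind of backup)
backups = [
    (datetime.date(2009, 1, 11), "full image"),
    (datetime.date(2009, 1, 12), "files"),
    (datetime.date(2009, 1, 13), "files"),
    (datetime.date(2009, 1, 14), "files"),
]

latest_full = max(d for d, kind in backups if kind == "full image")
restore_order = [(latest_full, "full image")] + sorted(
    (d, kind) for d, kind in backups if kind == "files" and d > latest_full
)

for when, kind in restore_order:
    print(f"Restore the {kind} backup from {when}")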
There are many options for backup devices, but with the steadily decreasing cost of external hard drives and USB memory sticks, these are looking like the backup device of choice for consumers and very small businesses. You need a minimum of 80GB of space to back up your entire system, and that's what I'd recommend you do. An external hard drive would normally be connected to your PC via a free USB2 socket, although in some cases you may also have eSATA, Ethernet or Firewire (IEEE1394) connections available as alternatives. From slowest to fastest, the connections to use for backup are USB2, Firewire and eSATA.
Should your machine stop working for either software or hardware reasons, the image backup allows you to return it to its last working state and be confident that everything should work exactly as it did previously. What this means is that should anything go wrong with your current installation, such as a faulty hard drive, a corrupt Windows file or a virus problem, all you need to do is insert the Windows Vista Ultimate or Business disc, plug in the backup device and start the PC. Once you have done so, follow the steps below to fully restore your PC to the exact state it was in at the last backup.
Keeping your backup image up to date means you don't have to worry about problems caused by software errors, virus infections, hardware failures or corruption. Similarly, should your system hard disk fail entirely, all you would need to do is get a replacement, install it and then follow the system restore process, and you would have your operating system and software back as they were before.
Please note that the system restore points described in Windows are a different tool. They save a snapshot of operating system files at a given time in another (usually hidden) directory on your system disk, so that without referring to a backup you can reverse a hotfix, service pack or driver update should it prove to cause instability or bugs. This is obviously no use to you if the system disk fails or becomes corrupted.

Doesn't RAID protect me from drive failures though?
Ideally, if you have also selected our RAID1 or RAID5 disk storage configuration, you can afford to lose one hard drive and simply 'hot swap' it out for a replacement without any interruption to the working system, because the array is able to rebuild the information that was on the missing drive on the fly until you are able to replace the defective drive. Once the defective drive is replaced, the redundant volume set is rebuilt on the new drive and fault tolerance is restored once again.
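For the curious, the reason a RAID5 set can carry on with a drive missing is parity: the data on the failed drive can be recomputed by XOR-ing what survives. A tiny Python illustration of the principle (three pretend 'drives' in memory, not real disk I/O):

# Three-byte "stripes" on two data drives plus one parity drive.
drive_a = bytes([0x12, 0x34, 0x56])
drive_b = bytes([0xAB, 0xCD, 0xEF])
parity  = bytes(a ^ b for a, b in zip(drive_a, drive_b))   # written when the stripe is written

# Suppose drive_b fails: its contents can be rebuilt from drive_a and the parity.
rebuilt_b = bytes(a ^ p for a, p in zip(drive_a, parity))
assert rebuilt_b == drive_b
print("drive_b reconstructed from the surviving drive and the parity blocks")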

Remember, your backup images don't care whether they are being backed up from or restored to a RAID drive, so you can back up a RAID set and restore to non-RAID, or vice versa.

How to restore Vista from a backup
Assuming your machine is now back up and running (or you are using a similar replacement machine), that you have at least the same capacity of hard drive space available, and that your backup device (USB or external hard drive) is to hand, recovering from your backups is very simple. Just go through the following steps:
1. Insert your Windows operating system DVD into the drive
2. Reboot your machine (or power on)
3. If you are asked to 'Press any key' to boot from CD-ROM, do so. The standard Windows installation process will now begin, with a progress bar along the bottom of your screen. Select your preferred language when prompted (e.g. 'UK English') and click 'Next'.
4. Once the Windows setup DVD has started, select the repair option from the first screen. You will see the 'Repair your computer' option on the bottom left; click on it and click 'Select'. On the following screen click 'Next' and you will get to a screen with an option to perform a 'Complete PC Restore'.
5. Then select the 'Windows Complete PC Restore' option to restore from your latest backup (ensure your backup device is already connected and switched on).
6. Windows will now look for backups on your device and allow you to restore your system from one of them; this will take a while, depending on how much data from your system disk was backed up.
7. Now remove the Windows setup DVD, and reboot your PC again
8. Your system will now be restored to the state of your last Full image backup
9. You can now go into Control Panel, System Maintenance, Backup and Restore, and choose to restore any documents, images, music, email or data files that were backed up individually but were not in the last full image backup.

Please note that running through this process WILL DESTROY any existing data on the disks, as it completely overwrites them with your backup image, including any new files that might be on the disk, so you need to be sure you really do want to restore from the backup.
Typically a full backup image will take up at least 10GB of space, and the restore will create a new volume of exactly the same size as the original, so you must ensure that you are restoring to a volume at least as big as the one you backed up.
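A quick sanity check before you restore: a minimal Python sketch (the drive letter and original volume size are hypothetical placeholders) that compares the target drive's capacity with the size of the volume you backed up.

import shutil

ORIGINAL_VOLUME_BYTES = 80 * 1024 ** 3      # hypothetical size of the volume that was imaged
TARGET_DRIVE = "C:\\"                       # drive you intend to restore onto

total, _used, _free = shutil.disk_usage(TARGET_DRIVE)
if total < ORIGINAL_VOLUME_BYTES:
    print("Target drive is smaller than the backed-up volume - the image restore will fail.")
else:
    print("Target drive is large enough to hold the restored image.")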

Backup and Disaster Recovery FAQ
What do I need to do to ensure backups are run?
Just make sure your machine is on at the time backups are set to run and that the backup device is connected and switched on. We also recommend you close all applications; it's advisable to have nothing running while backups are being taken.

Manually running backups
If you would like to run a one-off backup because you are about to change your system configuration or add hardware or software, and you want to be sure the system is safe, this is a prudent practice and easy to do. Go to the 'Start' bar (now a Windows icon in the case of Vista), click on Control Panel, followed by System Maintenance and then Backup and Restore Centre. Select 'Backup computer' to make a full image backup of the entire machine and its configuration, or select 'Backup files' to make incremental backups of changed data files (the latter being much quicker and requiring far less backup space).

Bear in mind that you need to rename your backup image folders manually so you can identify which one is which when it comes to restoring them; otherwise each successive backup will simply overwrite the last and you can't recover to a given point in time. To change the folder names, simply use Windows Explorer as you would on any other Windows file system device.

All backups are stored in a folder on the backup device named 'Windows image backup', and the default name of the folder inside it is the PC's computer name. Simply right-click the folder containing the new backup you have made (the new one will be identifiable by its date), select 'Rename' and give it a meaningful name. If you then need to restore to a given point, you will be offered a choice of folder names identifying all the backups you can restore from.
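Here is a small sketch of that renaming step in Python (the drive letter is hypothetical, and the exact folder names Vista writes may differ slightly from the description above, so check them in Explorer first): append today's date to the computer-name folder so the next backup does not overwrite it.

import datetime
import os

BACKUP_ROOT   = r"E:\Windows image backup"                 # folder on the backup device, as described above
COMPUTER_NAME = os.environ.get("COMPUTERNAME", "MY-PC")    # default folder name is the PC's name

src = os.path.join(BACKUP_ROOT, COMPUTER_NAME)
dst = f"{src}-{datetime.date.today().isoformat()}"         # e.g. ...\MY-PC-2009-01-20

if os.path.isdir(src) and not os.path.exists(dst):
    os.rename(src, dst)        # preserve this backup; the next run recreates the default folder
    print(f"Renamed {src} to {dst}")
else:
    print("Nothing to rename (folder missing or a dated copy already exists).")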

Do I need to backup anything else?
I recommend you also periodically (say once a month or once a quarter) back up your full system with a complete image backup. This is the only way to ensure the whole system, in its complete working state, is backed up and can be restored. If you don't mind waiting for it to finish, it's better to run a full backup as often as possible, but it can take some time (several hours).

Can I restore my system even if it's still working?
Yes! If you want to revert to a previous working state of your system as it was at the time of an earlier backup, you can go through the same procedure above to restore to that earlier point even though your system is still working. This is sometimes useful if a software or driver installation has caused problems, or a virus has infected your machine, and you would like to safely revert to an earlier known working and secure state.

If the Windows DVD/CD won't boot on my machine, what should I do?
Most likely your CD/DVD drive isn't set as the first-priority boot device, so the PC is scanning the hard disks, trying to boot from them first and failing. You need to check the boot priority settings in the machine's BIOS setup. You can usually enter the BIOS by pressing [Del] at start-up (PC start-up, not Windows start-up; if Windows is already starting, it's too late), and there is usually an on-screen prompt telling you when to press [Del] or another prescribed key. Your BIOS will usually give you three or more slots for the boot device priority order; for installing Windows or recovering from a backup, the first needs to be set to [CD-ROM] or [USB]. It doesn't matter much what follows in second priority, but ideally it should be your system disk, to minimise boot-up time. This is exactly the same process as if you were about to install Windows onto the machine from scratch.

Protecting backup media
Don't forget that your backup is only as good as the medium it's on. In business it's good practice to dummy-run a restore of your system every now and again to make sure the process and the media are working properly; if you have the time, I'd recommend you do the same. It's also advisable to protect the media, especially if all your backups are on a single external drive or USB device. Consider storing the device in a fire safe between backups to ensure it doesn't get destroyed in the event of a building fire, flood or collapse. To be doubly sure, buy two backup devices and rotate them on and off site (at a friend's house or a different business location); that way, if an aeroplane hits your house (hopefully while you are out!), your friend still has one of your backups for you…