Data Consolidation Techniques

As the volume of data in a company’s databases grows, consolidation becomes essential to managing it effectively and using it for business operations. Data consolidation means taking data from multiple locations and sources and integrating it into a single database for use across the company. Consolidation is one of the main components of data integration, alongside data propagation and data federation.

Data propagation duplicates information across different sources and locations, while data federation unifies the source information virtually, without physically moving it. Integrating data into a single database allows quicker access and better control, making data management more effective and efficient. Data consolidation itself is done with one of two technologies: ELT or ETL.

ELT stands for Extract, Load, and Transform. In an ELT system, the data is transformed after it has been loaded into the database. Once loading is complete, the data is transformed and delivered to tables that authorized users can access. ELT is sometimes called a pull system because transformation is performed on demand: users can transform and publish the data whenever they need it, after it is already in the database.

ETL, on the other hand, stands for Extract, Transform, and Load. This consolidation technique extracts information from multiple sources, transforms it according to standard rules, and then loads it into the target systems in the specified formats. It differs from ELT in that the data is transformed before, not after, the loading takes place. Transformation can take the form of reformatting, standardizing, or streamlining the data according to the data manipulation rules set by the company.

Extraction is the first stage of any data consolidation technique. Data may be extracted from a few sources or a great many, from relational and object databases as well as other documents, and it may be structured or unstructured. The next stage, transformation, varies with the consolidation technique used and can range from a single operation to a complex series of them. It is what delivers timely, relevant information for the management team’s decision making: the data is customized and tailored to what the company actually needs. The last stage is loading, which transfers and delivers the data to the target application. Loading is where the two techniques differ most: in ELT the data is loaded unprocessed, while in ETL it is loaded only after it has been processed.
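
To make the distinction concrete, here is a minimal ETL-style sketch in Python. It is an illustration only: the source file names, column names, and transformation rule are hypothetical, and a real consolidation job would add validation, scheduling, and error handling.

    import csv
    import sqlite3

    # Extract: pull rows from several (hypothetical) source files.
    def extract(paths):
        for path in paths:
            with open(path, newline="") as f:
                yield from csv.DictReader(f)

    # Transform: apply the company's standardization rules
    # (here, just trimming fields and upper-casing a country code).
    def transform(row):
        return (row["customer_id"].strip(),
                row["name"].strip(),
                row["country"].strip().upper())

    # Load: write the cleaned rows into the single target database.
    def load(rows, db_path="consolidated.db"):
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS customers "
                    "(customer_id TEXT, name TEXT, country TEXT)")
        con.executemany("INSERT INTO customers VALUES (?, ?, ?)", rows)
        con.commit()
        con.close()

    # ETL order: transform happens before loading. An ELT job would
    # instead load the raw rows first and run the transformation
    # inside the target database afterwards.
    load(transform(row) for row in extract(["sales.csv", "support.csv"]))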

Data consolidation, then, can be done with either of two techniques, but both aim to integrate the necessary data and information from different sources into a single database for effective data management.

PlayStation 3 Yellow Light of Death Fix – How Can You Fix PS3 YLOD All by Yourself?

The PlayStation 3 yellow light of death, or YLOD, is very appropriately named! When you go to turn on your system and watch that little yellow light flip to a blinking red light, it is very frustrating. You press the button over and over trying to get the system to boot, but it does no good. While many people assume their console has truly died and there is nothing to do but replace it, this is absolutely untrue.

It would certainly be easier and faster to just throw the system out and purchase a new one, but that would be a foolish waste of money, because the PS3 Yellow Light of Death is a fixable problem! Plus, it is very likely that the new console will eventually develop the same yellow light problem, and you can’t keep replacing it over and over.

That is why game enthusiasts are now turning to the internet to find fast, convenient, and affordable fixes for the PS3 yellow light issue. Before you determine which of these fixes is right for you, take a moment to consider what may be causing the problem in the first place.

In most cases, the flashing yellow light of death on a PS3 is caused by internal failures due to overheating. Many users notice that their console sounds much louder than usual and/or feels very warm to the touch before the yellow light problem develops. These are actually useful warning signs that the system is overheating, and if you address the problem right away, the flashing lights may be avoided altogether!

For future reference, the PS3 console needs adequate ventilation to prevent overheating, or at least make it less likely. Many users place their console flat on the ground or up against a wall, which blocks the fan vent designed to cool the system down. Others do not realize that dust collects in the vent and needs to be cleaned off; excessive dust can also lead to overheating because it blocks the free flow of air in and out of the unit.

Once you have cleaned the dust off your console and placed it in an area where it is not obstructed, it’s time to try a couple of quick fixes:

  • Unplug the console from the wall for half an hour, then plug it back in.
  • Remove the hard drive and reseat it.
  • Check all wires and connections to the console.

If you can borrow a hard drive that you know works, you will find out right away whether replacing the hard drive fixes the yellow light of death problem. In most cases, though, there will be more substantial internal damage, which will require you either to send the console in to Sony or to learn how to fix the PlayStation 3 blinking yellow light problem on your own.

Doing it on your own is much faster and will cost you less. Some online guides offer detailed video instructions that walk you through the repair in less than an hour, and a few even include extra bonuses that make your system more functional once the problem is fixed.

The only reason to ever pay the shipping to Sony is if you have a valid warranty that will pay for the repairs completely. Otherwise, you can do it cheaper and much, much faster by using a good online program with video instruction.

The History Of Data Deduplication

Data deduplication has been around, at least in its most primitive form, since the 1970s. It started because companies wanted to store large amounts of customer contact information without using a large amount of storage space. One of the first ideas was to go through the data and remove duplicates. For example, a company might have both a shipping address and a billing address for a given customer; in such cases, those identical addresses would be combined into a single record. This was done by data entry clerks who reviewed the data line by line and got rid of duplicates.

Of course, the number of personnel needed to do this was extensive, and it took a very long time. Sometimes the data deduplication process would take months to complete. Since most of this work was done on hard copy, however, that was not a major problem. The big problems came along when computer use became widespread in office environments.

With computers in wide use and the explosion of the internet, the amount of data available exploded as well. Backup systems were created to ensure that companies would not lose all their data. As time went by, floppy discs and other external hardware were used to store this data. Unfortunately, the data would soon fill up these discs, and the amount of space needed to store it kept growing.

With cloud storage and other alternative storage options, companies began moving their storage to a virtual environment. They also moved from tape-based to disk-based storage, simply because it cost less and required less space. Even so, these storage options became expensive and difficult to manage as data grew. The same data would get saved over and over again; this redundant data was not needed and took up valuable storage space.

Companies might have customized their backup plans to eliminate duplication, but there was no fast way to do this. That is when IT professionals began working on algorithms to automate the data deduplication process. They generally did this on a case-by-case basis, with the goal of optimizing their own backup files, so their algorithms were customized to meet their own individual needs.
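
The core idea behind those algorithms is still the basis of deduplication today: fingerprint pieces of data with a hash and store each unique piece only once. As a rough illustration only, here is a minimal content-addressed sketch in Python; the fixed chunk size and in-memory store are simplifications, not how any particular vendor’s product actually worked.

    import hashlib

    CHUNK_SIZE = 4096  # fixed-size chunks (real systems often vary them)
    store = {}         # hash -> chunk; stands in for the backup store

    def dedup_write(data):
        # Store data, keeping only one copy of each unique chunk;
        # return the list of chunk hashes needed to rebuild it.
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in store:   # first time we see this chunk
                store[digest] = chunk
            recipe.append(digest)     # duplicates are just references
        return recipe

    def dedup_read(recipe):
        # Rebuild the original data from its chunk references.
        return b"".join(store[d] for d in recipe)

    # Two "backups" that share most of their content...
    first = dedup_write(b"A" * 10000)
    second = dedup_write(b"A" * 10000 + b"B" * 100)
    # ...yet only three unique chunks end up being stored.
    print(len(store))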

No single company came up with the idea of data deduplication. Rather, the need to reduce duplicate files was common across the industry. Many computer scientists advanced data deduplication technology significantly, but no one scientist was solely responsible for it. While many have claimed credit for coining the term ‘data deduplication’, no one person can claim credit for the idea itself.

Instead, the creation of data deduplication algorithms was a collective effort. People in the IT industry saw a need to reduce duplicate data, and they filled that need by creating algorithms to eliminate duplicated files. As data volumes keep increasing, people will continue to find ways to compress data so that it is easy to store.

The 7 Secret Windows Shortcuts You Never Knew About

What if it took only a split second to issue a command from your fingertips? For years, regular Windows users have reached for the mouse instead of these cool shortcuts, which could save them real time and effort.

You can be sure that the seasoned tech experts have been using these secretly. And now, you will be shown the way as well. Hang on! Here they come.

1. Windows logo key + L key

Stepping away to the pantry for a fresh cup of coffee? Keep curious eyes away from your work with this quick shortcut – it locks your Windows session instantly.

2. Shift key + Delete key

Do you find it troublesome to always empty your Recycle Bin after you take out the trash? With this quick shortcut, you can bypass the Recycle Bin and get rid of a file instantly. But do note the downside – you won’t be able to retrieve any accidentally deleted files.

3. Alt key / Windows logo key + Tab key

Too many windows open? This handy shortcut lets you select the right one easily.

4. Shift key + Ctrl key + N key

Want to create a new folder instantly, without any clicking? Now you can. Hit this 3-key combination and a new folder will appear with the name ‘New folder’ already highlighted – so you can type in your preferred name.

5. Windows key + M key

Uh oh, did you open so many windows that they are cluttering your screen? Or are you sensing that your boss is coming up right behind you? Use this quick shortcut to minimize all of the windows instantly. It’s a great way to save your time, and sometimes, your appraisal at the end of the year.

6. Windows logo key + Left or Right Arrow key

If you use two monitors for your desktop, you may prefer some windows to appear on the left monitor and others on the right. Hitting this quick shortcut moves a window from one monitor to the next. And if you only have a single monitor, this shortcut will re-position your window to the side of the screen.

7. Windows logo key + (+/- key)

Can’t see that super small font? Or would you like a smaller view of your screen? Press the Windows key and + together to zoom in for a magnified view, or the Windows key and – to zoom out.

HP Spectre X360 – A Revolutionary Hybrid Laptop

A revolutionary hybrid laptop, the Spectre x360 is sleek and shiny, with an aluminum design similar to a MacBook’s. Both the lid and the bottom are aluminum and silver in color, and the 2-in-1 measures 14.8 x 9.75 x 0.63 inches. This laptop is not only beautiful; it is also super thin and light, making it very easy to carry, and it pairs a 15-inch screen with very good performance.

Design:

The HP Spectre x360 has been built for endurance and quality. The laptop weighs only 1.4 kg, so it can easily be carried around on the go. What’s fantastic about this laptop is the hinge: it has been cleverly engineered to flip 360 degrees so that the keyboard and screen sit back to back, effectively turning the laptop into a tablet. When in tablet mode, the keyboard keys are non-responsive, so there is no worry about having them press against the back of the screen. The thinness of the device also makes it very easy to handle in tablet mode. Another option is to place it at an angle with the keyboard acting as a stand, which is fantastic for watching trailers with the machine on top of a desk.

Display:

The 15-inch screen provides very bright images and vivid colors. At about 246 nits, brighter than average, the 2-in-1 remains perfectly visible outdoors or in direct sunlight. The resolution is 2560 x 1440 pixels, giving a pixel density of 221 ppi, and the multi-touch panel offers everything a professional would need to undertake the most demanding types of activities. It is a high-end IPS display with wide viewing angles. A glossy display is not unusual for convertibles, but this one is enough to make everything look sharp and nice, and its response time is better than average.

One change HP has introduced is the graphics chip. The Intel HD Graphics 520 is the successor to the 5500. The 520 can handle Flash games but hardly any intensive titles; on a 3D graphics benchmark it achieves a score of 64,632, which is adequate.

Keyboard:

The keyboard feels somewhat shallow, but it is backlit with quite a nice glow. The keys are full size, as on most laptops, and have 1.5 mm of travel. When pressed they feel like those of a high-end notebook: quite responsive, with a soft click. The touchpad measures 5.5 x 2.6 inches and feels smooth. It has plenty of room for navigating with the cursor or performing swipe gestures, so you will have no problem scrolling, pinching to zoom, and so on.

Connectivity:

Plenty of ports are included, among them the latest version of USB, Type-C, which also doubles as the power connector. The left side holds a USB 3.0 port, the headphone/mic jack, and an SD card reader, while the right side has two more USB 3.0 ports, an HDMI port, and a Mini DisplayPort for external monitors. A 1080p webcam sits at the top of the screen and can be used for selfies or social media; picture quality is quite good, with bright colors and sharp images that appear only slightly darker than real life. An Ethernet port is missing, but it was probably omitted because of the slim design; the machine is clearly meant to rely on Wi-Fi rather than a wired connection for internet access.

With the Bang & Olufsen speaker software, the audio is clear and loud.

Performance:

A 2.3 GHz Intel Core i5-6200U central processing unit powers this laptop, with 8GB of RAM and 256GB of SSD storage; other versions can be found with 16GB of RAM. This hybrid has enough performance and storage for any type of office or multimedia activity. HP offers a choice between two processors for the x360 series, the Intel Core i5 or the Intel Core i7-6500U, and both provide fantastic response times that allow a very comfortable workflow. Compared with its predecessors, the hardware is a clear upgrade.

The maximum power consumption is between 28.7 and 29.9 W, which is lower than HP’s Broadwell model. The battery of the HP Spectre x360 has a capacity of 56Wh and lasts about 8 hours and 20 minutes. That is not the best among convertibles, but being able to use it for almost an entire day on the go is convenient.

Software:

The x360 runs Windows 10, the latest version of Windows from Microsoft. HP has included a lot of bloatware which, if removed, can free up storage and improve performance. The Bang & Olufsen speaker software helps adjust the sound, and the HP Recovery Manager can be used to reinstall drivers and applications and to troubleshoot. Besides those, many other apps come preinstalled, such as iHeartRadio, Candy Crush Soda Saga, Snapfish, and Flipboard; many of them can be uninstalled to improve performance. Microsoft also provides a one-month free trial of Microsoft Office.

Pros & Cons:

For those who know computers, this laptop has its pros and cons, though one would need to be an expert to find something genuinely wrong with the x360. Compared with its competitors it is not the most efficient, merely average, and the hardware and software could be improved, particularly the graphics. The keyboard is great, but the machine could use more security measures, such as a fingerprint reader, and a stylus could have been included. Even so, HP has delivered a great hybrid that should appeal to many consumers. Hybrids in general are very popular, since they can be used as a laptop or converted into a tablet. The HP Spectre x360 sells at an average market price which, even allowing for the missing extras such as a stylus, makes it quite a good deal.

How to Recover Deleted Text Messages From a Sanyo CDMA Cellular Phone

As popular as cell phones are with teenagers, you would almost think they were born with them attached to their bodies. Just think about it: if they happen to get grounded and their phone is taken away, they act as if you have just removed their lifeline. Although most of our teens use these devices simply to stay in contact with close friends and family members, there are times when they can be tempted to engage in activities they have no business sticking their noses into. A few of the more common problems many parents face are sexting with girlfriends or boyfriends, and bullying other teens after an argument. Another big problem some parents face is illegal drug use. These are all problems that generally leave evidence on a teen’s cellular device, because these devices are their favorite way of communicating with others and are where most of their conversations take place. If a parent notices any hint that their teenager is engaging in any of these types of situations, it is recommended that they consult a professional investigator who knows how to recover deleted text messages from a Sanyo CDMA cellular device.

Experts who are knowledgeable in cell phone forensic investigations can easily retrieve data from your teen’s cellular device. This makes it easier for parents to obtain the information they need to see whether there are problems that must be addressed. A PI can recover deleted text, SMS history, an NV memory dump, phone book information, GUID properties, device properties, call history, the file system, and other data that has been erased from the cellular device.

One mistake some parents have made, which you will want to be aware of, is using over-the-counter SIM card readers. These readers can create more problems than they solve, as they are notorious for destroying data during the reading process, making it impossible to recover. Investigators who perform cell phone forensic investigations use equipment that costs thousands of dollars and will not destroy the data they are retrieving for you. Even then, several different tools often have to be tried before one is found that matches a Sanyo CDMA device; a store-bought reader is unlikely to match at all.

If you are concerned about any of your teen’s activities, the best investigative tool available is offered by experienced investigators who know what they’re doing. They can recover deleted text and other data in as little as 48 hours.

Copyright (c) 2010 Ed Opperman

What Is the Difference Between Hot, Warm and Cold Disaster Recovery?

When it comes to implementing your business continuity plan, what strategy do you adopt for the disaster recovery element? (For a description of the difference between disaster recovery and business continuity, please see my article on Disaster Recovery or Business Continuity?)

You may have heard the terms hot, cold and warm recovery, but what do they mean, and what are the advantages and disadvantages of each service?

Hot Standby

Hot standby is normally available to the users within minutes of a disaster situation. This level of service is achieved by total duplication of the computer systems covered (hardware, software and data). There will also be a requirement for a resilient network connection into the Hot Site.

Benefits – Available immediately; dedicated to the customer.

Disadvantages – Cost; complexity; management overhead.

Warm Standby

Warm standby is normally available to the users within hours of a disaster situation. This is by far the most common type of service utilised for IT disaster recovery, and typical recovery times range from 8 hours to 24 hours (depending on complexity, location and data volumes).

The service can be delivered from a remote recovery centre or, alternatively, delivered to site in the event of a disaster. Depending on the equipment involved, the configuration may be installed within an existing facility or a mobile recovery unit.

It should be noted that whilst the Hot standby option is normally dedicated to one customer, Warm standby is delivered on a subscription basis; industry standards are between ten and twenty-five subscribers per configuration. Availability is therefore not guaranteed in the event of a disaster. Testing is also normally limited to a predefined number of days per annum.

Benefits – Lower cost; reasonable availability.

Disadvantages – Availability; recovery timescales are longer; limited testing available; only available for a limited period following a disaster.

Cold Standby

Cold standby is the provision of computer and people facilities that are made available to the client within a few hours of the incident. Unless the service is backed up by a contract to supply the necessary computer equipment, the recovery period is likely to be several days. It is not unusual for Warm and Cold standby services to be combined, giving a very flexible approach to recovery.

Fully serviced office space is also available on a subscription basis. These are usually equipped with PCs, servers, printing facility and a network infrastructure. These would be described as Business Recovery Centres, and could also incorporate Cold space for central systems.

Benefits – Lower cost; large amount of available space (can accommodate large systems). Business Recovery Centres can accommodate several hundred people.

Disadvantages – Availability; recovery timescales are longer; limited testing available; only available for a limited period following a disaster; additional recovery services needed.

EMA Continuity

Fix Your Corrupt PowerPoint Presentation by Using MS PowerPoint Recovery

Microsoft Office is one of the most common and popular software suites among computer users. It contains several desktop applications for multiple purposes, and as computer users we rely on them daily. MS PowerPoint is one of them; it is used for creating and designing high-class presentation files. We all know the importance of a presentation. It is widely used in almost every sector, for example:

  • For marketing promotions or corporate training sessions in businesses and large enterprises.
  • For teaching or training purposes in the education sector.

MS PowerPoint is one of the finest tools for creating presentation files. A presentation file may contain several pages, known as slides because they appear in sliding form. A slide may contain several file objects, such as text, graphics, sound, movies and other objects, all of which make a presentation more attractive and appealing. Microsoft PowerPoint traditionally saves presentation files with the .ppt extension, and three file extensions are used in all:

  1. PPT: A PPT file stores all presentation data in a single binary file. It is used by MS PowerPoint 2003 and earlier versions.
  2. PPTX (Open XML): A PPTX file uses the Open XML format, which stores the document as a collection of separate files in compressed (ZIP) form; you can inspect this structure yourself with the short sketch shown after this list. It can be opened in MS PowerPoint 2007 & 2010.
  3. PPTM (Open XML Macro Enabled): A PPTM file, also known as a macro-enabled presentation, contains embedded macros. It can be opened in MS PowerPoint 2007 & 2010.
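
Because the Open XML formats are just ZIP containers of XML parts, you can see this structure for yourself. Below is a small Python sketch that lists the parts inside a presentation; the file name is a placeholder for any real .pptx file. When PowerPoint reports a PPTX file as corrupt, it is often this container or one of the parts inside it that has been damaged.

    import zipfile

    # "slides.pptx" is a placeholder; point this at any real .pptx file.
    with zipfile.ZipFile("slides.pptx") as pptx:
        for name in pptx.namelist():
            # Typical parts: ppt/presentation.xml, ppt/slides/slide1.xml,
            # ppt/media/image1.png, docProps/core.xml, ...
            print(name)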

It takes a lot of time and hard work to make a presentation file. A PPT file can be large, since it contains several slides and various file objects, and because of its large size and complex structure it can easily become corrupted or damaged.

No computer file or application is immune to corruption, and presentation files are no exception. The MS PowerPoint program becomes unable to open or read a corrupt file. Several unforeseen factors can be behind this corruption: virus infection, an unfinished or abrupt system shutdown, improperly closing the presentation file or application, an unexpected crash of the system hard drive, software malfunction, and human error or mishandling of a presentation are some of the common causes. When opening a corrupt presentation file, we may encounter errors, which are the clearest indication of file corruption. Some of the most common are: “This is not a PowerPoint Presentation” and “File is corrupted or damaged”.

Whenever a file gets corrupted or damaged, we need to repair it. We can use “Open and repair” to repair a corrupt file; it is a built-in repair feature provided by MS Office for corrupt documents, and one of the easiest and most effective solutions. The steps required are shown below:

  1. Open the MS PowerPoint program. Click the Office button (at the top left).
  2. A list will appear. Click Open.
  3. A small window will pop up. Select the corrupt file from the system directory.
  4. Click the arrow on the Open button. A drop-down list will appear.
  5. From the list, click Open and repair. After a few seconds, the file will be repaired and opened.

Note: This solution works efficiently on most corrupt files, but if it does not, we should try the following solutions.

  • We should try to open a corrupt file in OpenOffice Impress.
  • We should try to open a corrupt file in MS Word. If it gets opened, we can recover at least the text part of the file.
  • We should try to import the slides from the damaged file into a new file.
  • If these solutions do not work, then we may use a third-party recovery tool for corrupt files. MS PowerPoint Recovery consists of PPT, PPTX and PPTM recovery tools, which can easily recover our data from corrupt PPT, PPTX and PPTM files. It does not overwrite or replace the old document; it creates a new file and saves all the recovered data into it. The tool is practically self-explanatory, as it is equipped with an automated wizard interface.

Battlefield 2 Crashes on Startup, Solve the Problem Easily

Unfortunately, some users have faced problems with their purchased copies of the game. In some cases, the game crashed while the splash screen was loading (i.e. at startup), while in others it could be played for approximately 2 minutes before it started crashing. We will discuss the issues that cause Battlefield 2 to crash on startup.

Tested Solutions:

1. Reduce the Monitor Refresh Rate below 60 Hz

2. Update Graphics and Sound Drivers

3. Improve the Registry Conditions

4. Fix Bad Sectors on your Hard Disk

5. Exclude Battlefield 2 from Data Execution Prevention (DEP)

6. Lower the Screen Resolution

Reduce the Monitor Refresh Rate

Sometimes Battlefield 2 crashes on startup if the monitor refresh rate is above 60 Hz. Set the refresh rate to 60 Hz or below by performing the following steps:

1. Click Start | Control Panel.

2. Click Appearance and Personalization | Display | Screen Resolution.

3. Click Advanced Settings link.

4. Click Monitor tab in the new dialog.

5. In the Monitor Settings frame, select the value 60 Hertz from the Screen Refresh Rate drop-down box.

6. Click OK | OK | Close | Close.

7. Restart the game.

Update Graphics & Sound Drivers and DirectX

Outdated graphics and sound drivers often cause Battlefield 2 to crash on startup. Update all your drivers from the respective manufacturers’ websites.

The game also crashes if DirectX is outdated. You will find the latest version of DirectX on Microsoft’s official website; please avoid downloading it from third-party websites, which carry the risk of spyware.

Improve the Registry Conditions

This is one of the most common sources of the problem. Battlefield 2 crashes to the desktop if the registry is not properly configured; the registry may not have been set up correctly by the Battlefield installer or the game itself.

You need to use a third-party registry fixing product. Be careful to select a good one, because a bad product may harm your system in extreme cases.

Fix Bad Sectors on your Hard Disk

In some cases, it was found that the game was stored on bad sectors of the hard disk. This results in Battlefield 2 crashing on startup, as the game can neither extract its resources nor retrieve or save information in its files.

You need to check your hard disk for these bad sectors and repair them as follows:

1. Open My Computer.

2. Right click the drive where the game is installed. For instance, drive C.

3. Select Properties | Tools | Check Now.

4. Click Start button.

5. Follow the on screen instructions.

6. After completion, restart the game.

Exclude Battlefield 2 from Data Execution Prevention (DEP)

Data Execution Prevention (DEP) is a security feature included in modern Microsoft Windows operating systems that is intended to prevent an application or service from executing code from a non-executable memory region. This helps prevent certain exploits that store code via a buffer overflow.

Many times, DEP prevents Battlefield 2 from running smoothly. You need to exclude the game from DEP in order to stop Battlefield 2 crashing to the desktop.

1. Click Start.

2. Right click My Computer and select Properties.

3. Click Advanced System Settings link.

4. Click Advanced tab.

5. Click Settings from the Performance frame.

6. Click Data Execution Prevention tab.

7. Select the option Turn on DEP for all programs and services except those I select.

8. Click Add button.

9. Select the game files [especially executable (.exe) files], and click Open to add them to the list of excluded files.

10. Click OK | OK | Close.

11. Restart the game.

Lower the Screen Resolution

Battlefield 2 crashes on startup if it is run at an unsupported screen resolution. Lower the screen resolution as follows:

1. Right click Desktop.

2. Select Screen Resolution.

3. Select a lower resolution from the Resolution drop down box.

4. Click OK.

5. Restart the game.

Experts’ Recommendation