Tuesday, September 29, 2009

COMDDAP EXPERIENCE


What is COMDDAP?

COMDDAP is an association of the country's top information technology businesses. Its primary objective is to promote the sustainable development of the country's information technology industry through the voluntary collaboration of its member companies. COMDDAP aims to provide its members with general views and updates from the private and government sectors, as well as from highly regarded individuals, to further uplift the morale and knowledge of the organization.

The vision of promoting and elevating the standards of information technology (IT) in the Philippines fueled a group of prominent computer companies to form the Computer Distributors and Dealers Association of the Philippines, or COMDDAP. Its initial member companies represented the world's leading makers and providers of computer products, solutions and peripherals.

In 1997, the manufacturing sector - represented by industry leaders Hewlett Packard, Epson and Compaq, among others - was integrated into the COMDDAP membership, making the association a more diverse representation of the IT sector and thus the new name, COMPUTER MANUFACTURERS, DISTRIBUTORS AND DEALERS ASSOCIATION OF THE PHILIPPINES.

The annual COMDDAP endeavor encourages its participants to launch their products and services, as well as hold seminars with a variety of topics in the areas of technology, trends, applications, and information management.

COMDDAP's biggest leap was crossing borders and going regional, bringing the exposition down south to Cebu and Davao City, to even greater heights in the cities of Baguio and Naga, and most recently to Iloilo City.

Apart from its exhibits, the organization holds the COMDDAP Learning Center Project and the Training the Trainors program. These are part of the civic and social responsibility programs that the association has vowed to pursue. The project provides public high school students and public school teachers the opportunity to learn basic computer theories, get hands-on experience, and become familiar with computers.


My Experience..


We visited the expo held at the Apo View Hotel on July 2, 2009. It was very interesting because there were a lot of new technologies displayed at the event, and through the seminars we gained knowledge related to our course. I was amazed by the thin clients: despite its slim size, the t5720 Thin Client is full of features. It is powered by the AMD Geode NX 1500 processor with 512 MB of Flash memory and a standard complement of either 256 MB or 512 MB of DDR RAM.

What are the Key Features of an HP thin client?


• Reliability: Solid-state design means no moving parts, which results in higher reliability, lower ownership costs, and extended product life.

• Design security: HP makes it easy to lock down user settings and parameters on the client, or add a Smartcard reader for user authentication. Additionally, all critical user data and applications reside on your secure, centralized server. HP Sygate Security Agent is pre-installed on all t5720 Thin Clients.

• Improved manageability: HP’s alliance with Altiris brings a leading management solution to the thin client market. Altiris Deployment Solution’s standards-based, advanced thin client management solution helps reduce the costs of deploying, updating, and maintaining your thin clients. Free with each HP client!

• Open operating system: The HP Compaq t5720 Thin Client offers Genuine Windows XP Embedded operating system with Citrix ICA, Microsoft’s latest RDP client, Internet Explorer 6.0, Windows Media Player 9.0, and terminal emulations.

• Processing power: The processing power your workers need for a great server-based computing experience, whether running general office applications or your specific line of
business software on a terminal server.

• Smart graphics: Support for HP’s latest monitors with superior resolution, outstanding color, and high refresh rates.

• Invisible client: HP’s unique thermal design allows the t5720 to be mounted in many orientations. Recapture desktop space by utilizing HP Quick Release mounting solution. Chassis can be set either vertically or horizontally for positioning on a desktop or mounting on a wall, under a desk, or even on the back of a compliant monitor.

• Connectivity: Advanced connectivity features include a wide range of ports and support for options including smart card readers, modems, and other common options.

ASSIGNMENT 9



Information Environment

What is the Information Environment?

The Information Environment (IE) refers to the work of developing and providing services that enable people to find and manage information efficiently and effectively in their learning, teaching or research.
The information resources which people need are very varied - books, journals, research papers, teaching resources, videos, maps and more - and while they might be in any format, they are increasingly digital.

There is now a critical mass of digital information resources that can be used to support researchers, learners, teachers and administrators in their work and study. The production of information is on the increase, and ways to deal with this effectively are required. There is a need to ensure that quality information isn't lost amongst the masses of digital data created every day. If we can continue to improve the management, interrogation and serving of 'quality' information, there is huge potential to enhance knowledge creation across learning and research communities.

The aim of the Information Environment is to help provide convenient access to resources for research and learning through the use of resource discovery and resource management tools and the development of better services and practice. The Information Environment aims to allow discovery, access and use of resources for research and learning irrespective of their location.



A changing environment

Over time there has been a great deal of change to the context in which we are working. The Information Environment programmes have primarily worked with the overriding aim of improving access to and use of heterogeneous resources, but have also taken account of changes and provided a way to test, develop and evolve appropriate means to manage and use resources.

The significant environmental changes have been:

* moves towards Open Access research and learning being undertaken on the Web as the Web is able to support flexible models of research and learning
* dominance of search engines e.g. Google
* the development of ‘Web 2.0’ applications and services which support collaborative working and the creation and sharing of digital resources
* cloud computing
* the huge and continuing growth in digital 'data'
* increased awareness of the utility of service and resource orientated approaches to designing services

Cloud computing is a paradigm of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the "cloud" that supports them.

The rise in importance of repositories as a means of managing and sharing digital resources created as part of learning and research has meant that since 2005 much of the JISC Information Environment programme investment has focused on that area. Whilst in the last tranche of repository funding there was some work in the area of preservation, discovery, and other shared infrastructure, there is now a need to move away from the emphasis on 'repository' and to see repositories and the wide variety of resources and delivery mechanisms as part of the wider Information Environment context.




References:
http://www.jisc.ac.uk/whatwedo/themes/informationenvironment.aspx

ASSIGNMENT 8



Since outsourcing and insourcing have already been defined by my classmates, I will not include their definitions anymore.
As for me, I would prefer insourcing. Because our school cannot provide the budget for the said project, it is better to check our resources and allocate solutions that suit our financial capabilities.

Insourcing, aside from being cheaper, is more secure than outsourcing. Though outsourcing can provide better competition that makes it a little cheaper, that would not outweigh the advantages of insourcing.

Making use of our own resources is better since we can communicate with the programmers easily if something goes wrong in the system. As soon-to-be IT professionals, we know that the top priority of a system is security. Also, being concerned with security doesn't mean being careless with quality.

Finally, the university has skilled faculty and students. I'm not bragging, but of all the universities in Mindanao, our school offers high standards of learning when it comes to information technology. So why settle for the better if we can have the best?

Here are some facts about insourcing and outsourcing, to be fair:

Outsourcing is subcontracting a process, such as product design or manufacturing, to a third-party company. The decision to outsource is often made in the interest of lowering cost or making better use of time and energy costs, redirecting or conserving energy directed at the competencies of a particular business, or to make more efficient use of land, labor, capital, (information) technology and resources.

Outsourcing became part of the business lexicon during the 1980s. It is essentially a division of labour. Outsourcing in the information technology field has two meanings. One is to commission the development of an application to another organization, usually a company that specializes in the development of this type of application. The other is to hire the services of another company to manage all or parts of the services that otherwise would be rendered by an IT unit of the organization. The latter concept might not include development of new applications.

Outsourcing involves the transfer of the management and/or day-to-day execution of an entire business function to an external service provider. The client organization and the supplier enter into a contractual agreement that defines the transferred services. Under the agreement the supplier acquires the means of production in the form of a transfer of people, assets and other resources from the client.

The client agrees to procure the services from the supplier for the term of the contract. Business segments typically outsourced include information technology, human resources, facilities, real estate management, and accounting. Many companies also outsource customer support and call center functions like telemarketing, CAD drafting, customer service, market research, manufacturing, designing, web development, print-to-mail, content writing, ghostwriting and engineering. Offshoring is the type of outsourcing in which the buyer organization belongs to another country.

Outsourcing and offshoring are used interchangeably in public discourse despite important technical differences. Outsourcing involves contracting with a supplier, which may or may not involve some degree of offshoring. Offshoring is the transfer of an organizational function to another country, regardless of whether the work is outsourced or stays within the same corporation/company.

With the increasing globalization of outsourcing companies, the distinction between outsourcing and offshoring will become less clear over time. This is evident in the increasing presence of Indian outsourcing companies in the United States and United Kingdom, and in the opening of offices and operations centers there by Indian companies. The globalization of outsourcing operating models has resulted in new terms such as nearshoring, noshoring, and rightshoring that reflect the changing mix of locations. A major job being outsourced is accounting: tax returns for people in America can now be completed overseas.

Multisourcing refers to large outsourcing agreements (predominantly IT). Multisourcing is a framework to enable different parts of the client business to be sourced from different suppliers. This requires a governance model that communicates strategy, clearly defines responsibility and has end-to-end integration.

Strategic outsourcing is the organizing arrangement that emerges when firms rely on intermediate markets to provide specialized capabilities that supplement existing capabilities deployed along a firm’s value chain (see Holcomb & Hitt, 2007). Such an arrangement produces value within firms’ supply chains beyond those benefits achieved through cost economies. Intermediate markets that provide specialized capabilities emerge as different industry conditions intensify the partitioning of production. As a result of greater information standardization and simplified coordination, clear administrative demarcations emerge along a value chain. Partitioning of intermediate markets occurs as the coordination of production across a value chain is simplified and as information becomes standardized, making it easier to transfer activities across boundaries.

Due to the complexity of work definition, codifying requirements, pricing, and legal terms and conditions, clients often utilize the advisory services of outsourcing consultants (see sourcing advisory) or outsourcing intermediaries to assist in scoping, decision making, and vendor evaluation.

Insourcing is the opposite of outsourcing; that is, insourcing (or contracting in) is often defined as the delegation of operations or jobs from production within a business to an internal (but 'stand-alone') entity that specializes in that operation. Insourcing is a business decision that is often made to maintain control of critical production or competencies. An alternate use of the term implies transferring jobs to within the country where the term is used, either by hiring local subcontractors or building a facility.

Insourcing is widely used in an area such as production to reduce costs of taxes, labor (e.g., American labor is often cheaper than European labor), transportation, etc.

Insourcing at United Parcel Service (UPS) was described in the bestselling book The World Is Flat, by Thomas Friedman.

According to PR Web, insourcing was becoming more common by 2006 as businesses had less than satisfactory experiences with outsourcing (including customer support). Many outsourcing proponents responded to a negative consumer opinion backlash resulting from outsourcing their communications management to vendors who rely on overseas operations.

To those who are concerned that nations may be losing a net amount of jobs due to outsourcing, some point out that insourcing also occurs. According to a study by Mary Amiti and Shang-Jin Wei, in the United States, the United Kingdom, and many other industrialized countries more jobs are insourced than outsourced. They found that out of all the countries in the world they studied, the U.S. and the U.K. actually have the largest net trade surpluses in business services. Countries with a net deficit in business services include Indonesia, Germany and Ireland.

Insourcing is loosely used to refer to call centers that do the work of outsourcing companies. Companies that outsource include Dell, Hewlett-Packard, Symantec, and Linksys. The call centers and technicians contracted to handle the outsourced work are usually overseas. Customers may refer to these as "India" technical support when the agents are hard to understand over telecommunications. These insourcing companies were a great way to save money on outsourced work, but quality varies, and poor performance has sometimes harmed the reputations of companies that provide 24/7 customer/technical support.

I suggest our school go for insourcing, for the reasons and benefits stated above.

ASSIGNMENT 7



"[In the past it was electronics that grew strong;] today we are creating wealth by developing the BPO and tourism sectors as additional engines of growth. Electronics and other manufactured exports rise and fall in accordance with the state of the world economy. But BPO remains resilient. With earnings of $6 billion and employment of 600,000, the BPO phenomenon speaks eloquently of our competitiveness and productivity. Let us have a Department of ICT."

- BPO (business process outsourcing): with this, more Filipinos have the opportunity to get a good job, increasing our competitiveness and productivity.


"[On telecommunications, I have directed the Telecommunications Commission to act on the complaints about dropped calls and disappearing cellphone load.] We need to amend the Commonwealth-era Public Service Law. And we need to do it now…."

- A dropped call is a call that is irregularly disconnected. A call that is attempted but dropped within six seconds after the called party answers should not be counted as a call. The agency noted that blocked and dropped calls were caused by network congestion and system failure. Consumers will be grateful for the amendment of the Commonwealth-era Public Service Law because it works to our advantage.
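The six-second rule described above can be sketched as a simple check. This is only an illustration of the rule as stated; the function and parameter names are my own, not from any official NTC specification:

```python
def is_chargeable_call(answered_duration_sec: float) -> bool:
    """Return True if a call should be counted (and charged) as a call.

    Per the rule above, a call dropped within six seconds after the
    called party answers should not be considered a call.
    """
    return answered_duration_sec >= 6.0

# A call dropped 4 seconds after answer should not be charged:
print(is_chargeable_call(4.0))   # False
print(is_chargeable_call(10.0))  # True
```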


"As the seeds of fundamental political reform are planted, let us address the highest exercise of democracy, voting! In 2001, I said we would finance fully automated elections. We got it, thanks to Congress.”


- It provides an efficient way of voting, with fast and reliable results.

ASSIGNMENT 6




1. Boost computer speed

It is important to use high-speed computers for our connection to be fast.
But since the university lacks the budget to buy or even just upgrade our computers,
it is not practical for us to implement this.

2. Use software that speeds up any internet connection

for example:

WEBROCKET

webROCKET is a powerful, easy-to-use program for Windows® 95, 98, Me, NT,
2000, and XP which accelerates your Internet connection speed by up to 200%.

What does webROCKET do?
Without webROCKET, Windows® lacks the power to provide you with an optimal
Internet connection because of changing, unstable network conditions.

webROCKET automatically turbo charges your Internet connection by boosting
Internet data transport efficiency. webROCKET adapts your modem or high-speed
connection to its maximum potential.

webROCKET is compatible with any home or office Internet connection that works
in Windows®, including dial-up modems of any speed, and high-speed connections
such as cable modems, DSL, ISDN, T-1, LAN, etc. It works with all Internet services,
including AOL and local ISPs.

The disadvantage of using accelerators is that the quality of the interface is
decreased. That is not a bad option for us, since our only concern is to speed up the
internet connection. But if you were to ask me, I would not sacrifice quality for speed.

3. Use repeaters or routers

Network repeaters regenerate incoming electrical, wireless or optical signals.
With physical media like Ethernet or Wi-Fi, data transmissions can only span a
limited distance before the quality of the signal degrades. Repeaters attempt to
preserve signal integrity and extend the distance over which data can safely travel.

Thus, if the distance from the CAS to the ENG'G building or to other buildings exceeds
the maximum, we can use repeaters to regenerate the signal.
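As a rough sketch of that idea, the snippet below estimates how many repeaters a cable run needs, assuming the ~100 m maximum segment length typical of UTP Ethernet. The distances are made-up examples, not measured campus distances:

```python
import math

MAX_SEGMENT_M = 100  # typical maximum run for UTP Ethernet (e.g. 100BASE-TX)

def repeaters_needed(distance_m: float) -> int:
    """Number of repeaters required so that no segment exceeds MAX_SEGMENT_M."""
    if distance_m <= MAX_SEGMENT_M:
        return 0
    # Each repeater regenerates the signal and starts a fresh segment.
    return math.ceil(distance_m / MAX_SEGMENT_M) - 1

print(repeaters_needed(80))   # 0 -- fits in a single segment
print(repeaters_needed(250))  # 2 -- e.g. a long run between buildings
```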

Routers are specialized computers that send your messages and those of every
other Internet user speeding to their destinations along thousands of pathways.

So if we use routers, choosing the best path will evidently accelerate internet speed.
Our school currently uses routers, but the internet connection is still very slow.

4. Careful selection and planning of the medium to be used.

Since we are already using fiber-optic cables and UTP cables (if I'm right), we should
maintain and maximize the use of these media so that data travels faster.

5. Increase in bandwidth

We've already increased our bandwidth, but our connection still hasn't worked out well.
So I think it is not the solution to our problem, because increasing bandwidth
will cost us much.
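To see why more bandwidth alone may not fix things, here is a back-of-the-envelope transfer-time calculation. The figures are purely illustrative, not our school's actual numbers:

```python
def transfer_time_sec(file_mb: float, bandwidth_mbps: float) -> float:
    """Ideal transfer time: file size over link bandwidth.

    Ignores latency, congestion and sharing -- which is exactly
    why real-world speeds often lag the ideal.
    """
    file_megabits = file_mb * 8  # megabytes -> megabits
    return file_megabits / bandwidth_mbps

# Doubling bandwidth halves only the *ideal* time; congestion and
# shared usage can still keep the real connection slow.
print(transfer_time_sec(10, 1))  # 80.0 seconds at 1 Mbps
print(transfer_time_sec(10, 2))  # 40.0 seconds at 2 Mbps
```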

6. Proper usage

As we've discussed in our MIS subject, no matter how fast or
advanced the technology you're using, it will be useless if the users don't use their minds...
What I mean is, these are only tools, and we are the users of these tools.
As I searched the web, here are some tips on how to improve internet connectivity:
Broadband Connection Speed
The only thing better than a fast broadband Internet connection, is a faster broadband Internet connection. Broadband Internet speed tests allow you to measure your current broadband speed against that of faster broadband Internet connections. There are various programs and software packages that you can purchase through which you can increase the speed of your Internet connection. You can also make adjustments to the hardware components (Upgrade processor speed and memory levels) of your system maximizing your computer’s broadband connection potential.

If you’re not looking to purchase additional software / hardware add-ons, there are manual “tweaks” that you can make to your system through which you can boost your broadband speed.
Increasing Your Broadband Internet Speed
Let’s assume that you access the Internet via a broadband LAN line. The following are 3 examples of ways through which you can manipulate your network settings and increase your broadband speed:
Reduce your network latency by increasing the request buffer size
Tests of LAN broadband connections have shown that delays can be caused by the default request buffer size setting of 4356 decimal. Increasing it to 16384 decimal has been shown to allow better performance. (Such an increase is only possible if you have the necessary memory.) With this slight "tweak," you can noticeably increase your internet speed and broadband networking capability.
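For illustration, the request buffer size described above lives in the Windows server-service parameters, and a registry file like the one below would raise it from the 4356 default to 16384. This is only a sketch assuming the standard LanmanServer key is the one the tweak refers to; back up your registry before applying anything like it:

```reg
Windows Registry Editor Version 5.00

; Raise the server-service request buffer from the default 4356
; to 16384 decimal (hex 0x4000), as described above.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters]
"SizReqBuf"=dword:00004000
```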
Altering your network task scheduler
If you’ve encountered long waits when trying to open network folders, then this “tweak” is for you. One of the default settings with broadband networking is that when you open a network folder, your system performs a test of the networked computer in order to search for scheduled tests. By disabling this option for a LAN connection, you can increase your broadband networking speed.
Increasing your network transfer rate
Transfer rate, also referred to as throughput, refers to the speed at which data can be moved from 1 location to another. Network redirector buffers serve the purpose of optimizing your disk performance, and therefore allowing for the fastest possible broadband networking speed. If you increase the number of network redirector buffers functioning on your system, you could greatly increase your throughput. An Internet speed test following this change will yield noticeable results.
Internet Speed Tests
If you are looking to perform an Internet speed test for your system, there are various tools online through which you can provide your ISP, your area code, your connection type, etc., and receive a reading of your broadband speed compared to the top providers in your area. This allows for you to realize if your broadband speed is lacking in comparison to others, and work to maximize your broadband networking potential. This can be achieved through implementation of the above tweaks or through hardware upgrades and software purchases.
How to Increase Internet Connection Speed
Instructions
Things You'll Need:
• Computer
• Internet connection
• Web browser
• ISP telephone number
1. Step 1
Find out from your Internet Service Provider (ISP) what Internet connection speed you're paying for. Make sure the speed you're paying for is the speed programmed into their network.
2. Step 2
Test your Internet connection speed. You can do this by going to one of these speed test websites: Speakeasy.net/speedtest or Speedtest.net. Record your results.
3. Step 3
Compare the speeds from step one and step two. If you're getting the speed you're paying for, go no further. If not, go to the next step.
4. Step 4
Disable web-browser add-ons that can slow down your Internet connection speed. Check whether you have multiple add-ons operating with your browser. For example, if your web browser is Internet Explorer, go to Tools, select Manage Add-ons, and look at which add-ons are enabled. Disable the ones you do not want to use.
5. Step 5
Run anti-virus, adware, spyware, and malware scans. All of these, if found on your computer, could negatively affect your Internet connection speed.
6. Step 6
Run Disk Cleanup and Disk Defragmenter from your System Tools menu.
7. Step 7
Download TCP Optimizer software to optimize your computer's MTU (Maximum Transmission Unit) value, RWIN (Receive Window) value, and broadband-related registry keys. The most popular free TCP optimizer that I found is called "SG TCP Optimizer". You can download it at CNET: http://www.download.com/SG-TCP-Optimizer/3000-2155_4-10488572.html?tag=lst-1 or at PCWORLD: http://www.pcworld.com/downloads/file/fid,68524-order,1-page,1/description.html.
8. Step 8
Retest your Internet connection speed by going to one of these speed test websites: Speakeasy.net/speedtest or Speedtest.net. Record and compare these results with the results obtained from steps one and two.
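The RWIN value that tools like TCP Optimizer tune is conventionally derived from the bandwidth-delay product: link speed times round-trip time, rounded up to a whole number of maximum-size TCP segments. The sketch below shows that standard calculation with illustrative numbers; it is not TCP Optimizer's actual code:

```python
import math

def rwin_bytes(bandwidth_mbps: float, rtt_ms: float, mss: int = 1460) -> int:
    """Receive window sized to the bandwidth-delay product.

    The window is rounded up to a multiple of the maximum segment
    size (1460 bytes is typical for Ethernet).
    """
    bdp_bytes = bandwidth_mbps * 1_000_000 / 8 * (rtt_ms / 1000)
    segments = math.ceil(bdp_bytes / mss)
    return segments * mss

# A 2 Mbps link with 200 ms round-trip time needs roughly a 50 KB window:
print(rwin_bytes(2, 200))  # 51100
```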
Step 1: Modifying your LAN Properties.

a) Go to Start>Network Connections.

b) Right Click>Properties on your main Internet connection.

c) Make the following changes: unselect everything except "Internet Protocol (TCP/IP)".





Step 2: Running Lvllords Patch.

In Windows XP SP1, TCP connection attempts were unlimited. However, Microsoft limited concurrent half-open connections to 10 in SP2. You can allow more TCP connections and give your speed an all-time high by following these steps:

a) Extract the patch and run it.
b) Press "C" to change the limit and set it to 100.

c) Press "Y". After you do that, you'll see a Windows XP message saying that your original files are being replaced. That's normal, so don't panic. Click "Ignore" or "Cancel" on that window, and press "Yes" when it asks for confirmation.

d) You'll get a message that the patch was successfully executed. Exit and reboot your computer. Don't forget to bookmark this page (hit CTRL+D) so you can return once you've rebooted.


3) SG TCP Optimizer - using it!

Back already? Cool! So this is where you use SpeedGuide's TCP Optimizer. How? It's your lucky day, my friend.

a) This program modifies your system registry (nothing to worry about!) in order to boost your Internet speed. Here's what you have to do:
b) Choose your connection speed - mine is 256 kbps. Go down and select "Optimal settings", then click "Apply changes". You'll then be prompted with a confirmation dialog.

c) Click OK, and click "Yes" to reboot your computer. That's all you need for a full-on high-speed Internet connection! Enjoy browsing, and don't forget to check out the other tweaks and tricks on this site. (Hint: bookmark this page to make an easy comeback.)

ASSIGNMENT 5


Organizations are as alike and unique as human beings. Similarly, group processes can be as straightforward or as complex as the individuals who make up the organization. It is vital to successfully launching a new program that the leaders understand the strengths, weaknesses, and idiosyncrasies of the organization or system in which they operate. Try to anticipate barriers to implementation so that you can develop strategies to minimize their impact or avoid them altogether. The following list of common barriers can be used to help your leadership team identify potential obstacles. The list of essential elements for change can help the team brainstorm possible solutions.


A barrier is an obstacle which prevents a given policy instrument being implemented, or limits the way in which it can be implemented. In the extreme, such barriers may lead to certain policy instruments being overlooked, and the resulting strategies being much less effective. For example, demand management measures are likely to be important in larger cities as ways of controlling the growth of congestion and improving the environment. But at the same time they are often unpopular, and cities may be tempted to reject them simply because they will be unpopular. If that decision leads in turn to greater congestion and a worse environment, the strategy will be less successful. The emphasis should therefore be on how to overcome these barriers, rather than simply how to avoid them.

How should we deal with barriers in the short term?

It is important not to reject a particular policy instrument simply because there are barriers to its introduction. One of the key elements in a successful strategy is the use of groups of policy instruments which help overcome these barriers.

How can we overcome barriers in the longer term?

It is often harder to overcome legal, institutional and technological barriers in the short term. There is also the danger that some institutional and political barriers may get worse over time. However, strategies should ideally be developed for implementation over a 15-20 year timescale. Many of these barriers will not still apply twenty years hence, and action can be taken to remove others. For example, if new legislation would enable more effective instruments such as pricing to be implemented, it can be provided. If split responsibilities make achieving consensus impossible, new structures can be put in place. If finance for investment in new infrastructure is justified, the financial rules can be adjusted.



Operational problems
* Need for co-ordination and networking among professionals and educational institutions
* Need for supportive policy to release information
* Insufficient access to information sources
* Confidentiality
* Redundancy of information
* Nature of policy directives
* Need for identifying sources of information
* Centralization of activities
* Need for systematic documentation


Operational improvement

* Set clear policy guidelines on information dissemination
* Encourage government to have depository laws and enforce them
* Introduce information system
* Use mass media
* Follow a bottom-up approach
* Develop grassroots level inventory of information
* Create awareness of the value of information
* Identify user information needs
* Consult target groups
* Develop target-oriented and useable information
* Develop effective system of information management and dissemination; information should be simple, understandable and manageable
* Institute efficient and effective co-ordination and networking
* Encourage a free flow of information — horizontally and vertically

Financial barriers

The figure below provides perceptions of the severity of financial barriers for European cities. It suggests that road building and public transport infrastructure are the two policy areas which are most commonly subject to financial constraints, with 80% of cities stating that finance was a major barrier. Information provision, again, was the least affected in terms of financial constraints. The only differences by city size are that small cities are less likely to perceive financial constraints on land use policies, and large cities are even less likely to identify financial constraints on information measures.

Political barriers

The figure below summarises information on political barriers for European cities. It suggests that road building and pricing are the two policy areas which are most commonly subject to acceptability constraints, with around 50% of cities stating that acceptability was a significant constraint on road building and pricing measures. Public transport operations and information provision were the least affected by acceptability constraints. Generally, large and small cities were more likely than medium sized cities to identify political barriers. Large cities were much more likely to perceive such barriers for road and rail infrastructure projects; small cities were more likely to identify them for pricing measures.

Practical and technological barriers

While cities view legal, financial and political barriers as the most serious which they face in implementing land use and transport policy instruments, there are some concerns also over practical limitations. For land use and infrastructure these may well include land acquisition. For management and pricing, enforcement and administration are key issues. For infrastructure, management and information systems, engineering design and availability of technology may limit progress. No attempt was made to survey cities' views on these, since they are very specific to individual instruments.



References:
http://www.mywhatever.com/cifwriter/content/22/4481.html
http://www.elseviersocialsciences.com/transport/konsult/public/level1/l1_barr.htm

ASSIGNMENT 4



http://www.campuscomputing.net/summaries/2000/index.html

The growing demand for IT talent across all sectors of the booming economy poses significant staffing challenges for US colleges and universities, according to new data from The Campus Computing Project. Campus IT officials place “retaining current IT personnel given off-campus competition” and “helping IT personnel stay current with new technologies” at the top of the list of 27 strategic, budget, and personnel issues confronting their institutions over the next two to three years.



http://www.it.utah.edu/leadership/green/index.html



Green computing techniques are easy to incorporate and will result in:

* A reduction in overall operating costs by reducing power use, using shared hardware resources, reusing similar systems, and reducing supplies such as toner, ink and paper.
* Enhanced work environments such as campus computer lab space and office work space with reduced noise pollution and eye strain from traditional CRTs.
* Corporate and social responsibility through a focus on the Triple Bottom Line, an expanded set of success values focusing on people, planet and profit.
* An enhanced University Image: green computing solutions on the U campus can be used as marketing tools for potential students and researchers.
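The savings claimed in the first bullet are easy to estimate with simple arithmetic. The sketch below uses illustrative figures I have assumed (typical CRT vs. LCD monitor wattage, lab hours, and an electricity rate); none of these numbers come from the articles above.

```python
# Rough estimate of annual savings from replacing CRT monitors with LCDs.
# All figures below are illustrative assumptions, not measured campus data.
CRT_WATTS = 70        # typical CRT monitor draw (assumed)
LCD_WATTS = 25        # typical LCD monitor draw (assumed)
HOURS_PER_DAY = 10    # lab operating hours (assumed)
DAYS_PER_YEAR = 250   # days the lab is open per year (assumed)
RATE_PER_KWH = 0.10   # electricity cost in USD per kWh (assumed)

def annual_cost(watts):
    """Energy cost of one monitor per year, in USD."""
    kwh = watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000.0
    return kwh * RATE_PER_KWH

savings_per_monitor = annual_cost(CRT_WATTS) - annual_cost(LCD_WATTS)
print(f"Estimated saving per monitor: ${savings_per_monitor:.2f}/year")
# -> Estimated saving per monitor: $11.25/year
```

Multiplied across hundreds of lab machines, even a modest per-monitor saving like this adds up, which is the point the bullet list is making.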


http://inews.berkeley.edu/articles/Spring2009/green-computing



Green computing: Decreasing IT-related energy consumption on campus


Anyone who reads blogs or billboards in Berkeley knows energy efficiency matters to the campus and community. At UC Berkeley, energy efficiency relates to the "triple bottom line" philosophy, the view that social, environmental, and economic consequences should all be considered when planning for the future. Chancellor Birgeneau has committed the University to lowering its greenhouse gas emissions to 1990 levels by 2014, a target six years ahead of California's statewide goal. But this is a tall order, as much has changed on campus in the last 19 years. For one, the role computers play has dramatically increased — and, despite the improved efficiency of newer machines, so has the energy that computing consumes.

ASSIGNMENT 3


Fully automated 2010 elections in peril


MANILA, Philippines—The Commission on Elections may have to abandon its plan to fully automate the 2010 elections because Congress’ delay in passing the P11.3-billion automation budget has compromised election preparations, according to Comelec Chair Jose Melo.

Melo said the Comelec is considering the idea of partially computerizing the 2010 polls as the delay in the release of the budget has considerably reduced the time to prepare for the country’s first-ever national computerized elections.

“We are really pressed for time. We are toying with the idea that it won’t be full capacity. Maybe we will automate 50 percent, not nationwide,” he said.

The Comelec had planned on fully computerizing the voting and canvassing of the 2010 presidential and national elections, as it is mandated by law.

Faster count, less fraud

Automating the process will also mean a faster count and less human intervention, which is what opens the door to fraud, it said.

It asked Congress to pass the P11.3-billion supplemental budget for poll automation before legislators go on a break on March 6 to give the poll body ample time to prepare for the bidding for the machines.

But complaints and questions from the House of Representatives about the automation process have stalled the passage of the supplemental budget.

The Senate finance committee deferred the approval of the bill pending its final approval by the House. Both chambers are taking a month-long break beginning next week, and resume sessions on April 13.

Sen. Edgardo Angara, the finance committee chair, said it was likely Congress would pass the appropriations bill when it resumes sessions on April 13.

MalacaƱang on Thursday reminded legislators that President Gloria Macapagal-Arroyo wants to leave automated elections as her legacy amid concerns over the delayed passage of a budget for the process.

Racing against time

Deputy Presidential Spokesperson Anthony Golez said the P11.3-billion supplemental budget for poll automation had been certified as urgent by Ms Arroyo.

Golez said that the strongest signal sent by the President to congressmen and senators to pass the measure was her certification of it as urgent.

“We hope and pray for the wisdom of Comelec and Congress to come to terms to achieve the goal,” he said.

The Comelec is racing against time to get the budget passed by April 12 so the contract can be awarded by May.

Melo said Congress should guarantee that the bill would be passed on April 13 as the poll preparations are already behind schedule.

He said there may not be enough time for the suppliers to order and configure 80,000 machines for the nationwide exercise if Congress passes the budget bill in mid-April.

The Comelec is also pessimistic that it can train enough technical people on time, the poll chief said.

Is The Philippines Ready for an Automated Election System?


The computerized or automated election in the Autonomous Region in Muslim Mindanao (ARMM) scheduled on August 11, 2008 is being threatened by the Moro Islamic Liberation Front (MILF). However, this article is not about politics in the Philippines. It is about the computerization of the Philippine election system.

Are Filipinos ready for automation? Let me give you some facts about the Autonomous Region in Muslim Mindanao (ARMM), then decide for yourself whether the Philippines is ready for an automated election system (AES) as mandated by law (RA 9369 - Automated Election Law):

* The 2003 functional literacy survey of the National Statistics Office (NSO) showed ARMM as having the lowest basic literacy rate in the country, with 30 percent of its people aged 10-64 years old considered illiterate.
* On a national level, one in 10 Filipinos cannot read and write, according to the survey.
* Ustadz Esmael Ibrahim of the Assembly of Darul Ifta of the Philippines said illiteracy in the ARMM is worst in Sulu, where 40 percent of the people are unlearned.
* In addition, according to reports, two voting technologies will be used in the ARMM elections: Direct Recording Electronic (DRE) in Maguindanao, and Optical Mark Reader (OMR) in Lanao del Sur, Basilan, Sulu, and Tawi-Tawi. More than 3,000 DRE machines and 156 OMR counting machines will be delivered to ARMM.

According to Comelec, “DRE uses electronic ballot, records votes by means of a ballot display provided with mechanical or electro optical components that can be activated by the voter, processes data by means of a computer program, records voting data and ballot images, and transmits voting results electronically.”
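The DRE steps quoted above (record each ballot, keep an audit record, then transmit tallies electronically) can be illustrated with a toy sketch. This is a purely hypothetical model written for this post; the class, its methods, and the use of a hash digest as a stand-in for the stored "ballot image" are my own assumptions, not the actual Comelec system.

```python
import hashlib
import json

# Toy model of the DRE workflow described by Comelec: record votes,
# store an audit record per ballot, and tally results for transmission.
# Hypothetical sketch only -- not the real ARMM election software.
class DreMachine:
    def __init__(self):
        self.records = []

    def cast_vote(self, ballot):
        """Record a ballot selection made on the electronic display.

        `ballot` maps an office to the voter's choice, e.g. {"governor": "A"}.
        A SHA-256 digest stands in for the stored ballot-image audit record.
        """
        digest = hashlib.sha256(
            json.dumps(ballot, sort_keys=True).encode()
        ).hexdigest()
        self.records.append({"ballot": ballot, "digest": digest})

    def transmit_results(self):
        """Tally the stored votes, as would be transmitted electronically."""
        tally = {}
        for record in self.records:
            for office, choice in record["ballot"].items():
                tally.setdefault(office, {}).setdefault(choice, 0)
                tally[office][choice] += 1
        return tally

machine = DreMachine()
machine.cast_vote({"governor": "A"})
machine.cast_vote({"governor": "B"})
machine.cast_vote({"governor": "A"})
print(machine.transmit_results())  # {'governor': {'A': 2, 'B': 1}}
```

The sketch also hints at why the article's reliability question matters: everything here is just data in memory, so the integrity of a real DRE depends entirely on how well the software and its audit records are protected and tested.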

The automated ARMM election is a pretest for the 2010 Presidential elections in the Philippines. If this test succeeds, then for sure the Automated Election System will be used. If not, then the Philippine government may consider going back to the "control" method, the conventional election most Filipinos are used to, or improving whatever weaknesses are identified in the implementation of the computerized election process.

This brings to mind the question: how reliable can a computerized (automated) election system be, knowing that anything electronic is much easier to falsify? Will the election finally put an end to the “dagdag-bawas” dilemma in our nation’s election results? Or will the automation process make it much easier to fake election returns? Has the software in the automated election machines been properly tested and proven bug-free? That we will find out after the ARMM elections. Let the “trial-and-error” in our election process begin on August 11, 2008.




By Kristine L. Alave, TJ Burgonio
Philippine Daily Inquirer
First Posted 06:03:00 02/27/2009





References:
http://www.jpsimbulan.com/2008/08/06/is-the-philippines-ready-for-an-automated-election-system/
http://newsinfo.inquirer.net/inquirerheadlines/nation/view/20090227-191319/Fully-automated-2010-elections-in-peril

ASSIGNMENT 2



Based on the organization(s) that you visited, what do you think are the risks associated with business and IS/IT change?



We visited GH Office Depot to conduct an interview about the risks associated with business and IS/IT change. The IT staff said that a change in IS/IT would have a big effect on them: most of their transactions rely on technology, so a sudden change would disrupt their business.

Given the serious security risks to information technology (IT) assets, managing those risks effectively is an essential task for the University and its departments. The process is one that will benefit both the individual department and the University as a whole. Completing such a risk management process is extremely important in today’s advanced technological world. It is important that management understand what risks exist in their IT environment, and how those risks can be reduced or even eliminated.

Like fire insurance, ITS-RM is a form of protection that the University simply cannot afford not to have. The University has business processes, research and instructional efforts, and legally protected data that depend on IT assets, which UVa cannot afford to lose or have exposed. Unfortunately, these IT assets are subject to an increasing number of threats, attacks and vulnerabilities, against which more protection is continually required. The ITS-RM program is an essential component in this overall effort.

Although the IT Security Risk Management (ITS-RM) program will likely be welcomed by departments that have already experienced loss of mission-critical IT resources, many will not fully appreciate the need for assessment and planning. Consequently, a University policy regarding participation is necessary.



Effects of Technology on Business

Businesses have been at the forefront of technology for ages. Whatever can speed production will draw in more business. As computers emerged in the 20th century, they promised a new age of information technology. But in order to reap the benefits, businesses needed to adapt and change their infrastructure [source: McKenney]. For example, American Airlines started using a computerized flight booking system, and Bank of America took on an automated check-processing system.
Obviously, most business is now conducted over personal computers or communication devices. Computers offer companies a way to organize dense databases, personal schedules and various other forms of essential information.

As information travels faster and more reliably, barriers of distance disappear, and businesses are realizing how easy it is to outsource jobs overseas. Outsourcing refers to the practice of hiring employees who work outside the company or remotely -- even halfway across the world. Companies can outsource duties such as computer programming and telephone customer service. They can even outsource fast-food restaurant service -- don't be surprised if you're putting in your hamburger order with a fast-food employee working in a different country entirely. Outsourcing is a controversial practice, and many believe that U.S. companies that take part are hurting the job market in their own country. Nonetheless, from a business perspective, it seems like the wisest route, saving companies between 30 and 70 percent [source: Otterman].

Another technology that's starting to revolutionize business is actually not very new -- it's just cheaper these days. Radio frequency identification (RFID) technology is infiltrating and changing business significantly in a few ways. Microchips that store information (such as a number equivalent of a barcode and even an up-to-date history of the chip's travels) can be attached to products, and this helps companies keep track of their inventory.
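The inventory-tracking idea above is straightforward to sketch in code: each tag carries an item identifier, and every reader scan appends to the tag's travel history. The class and field names below are hypothetical illustrations, not any real RFID vendor's API.

```python
from datetime import datetime

# Minimal sketch of RFID-based inventory tracking (hypothetical data model):
# each tag links a tag ID to an item number and accumulates a scan history,
# mirroring the "up-to-date history of the chip's travels" described above.
class RfidTag:
    def __init__(self, tag_id, item_number):
        self.tag_id = tag_id
        self.item_number = item_number
        self.history = []  # list of (timestamp, location) pairs

    def scan(self, location):
        """Record that a reader at `location` detected this tag."""
        self.history.append((datetime.now(), location))

    def last_seen(self):
        """Return the most recent location, or None if never scanned."""
        return self.history[-1][1] if self.history else None

tag = RfidTag(tag_id="E200-0017-2211", item_number="SKU-4412")
tag.scan("receiving dock")
tag.scan("warehouse shelf B3")
print(tag.last_seen())  # warehouse shelf B3
```

In a real deployment the scans would be written by fixed readers at doorways and conveyors, so inventory location updates itself without anyone scanning a barcode by hand.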

Some businesses have even begun to use RFID chip implants in humans to tighten security. An access control reader detects the chip's signal and grants the employee access through the door. But many people are concerned about privacy issues if this were to become a widespread practice.


references:

http://www.itc.virginia.edu/security/riskmanagement/
http://communication.howstuffworks.com/technology-changed-business1.htm

