Error -1074396113 Convert Pixel To Real World During Camera Feed

Hello,
I am fairly new to LabVIEW and am encountering an error that I can't seem to solve.
I am utilizing an image feed from a firewire camera to detect the real world location of a white line.  I have modified a tested, existing VI that processes the stream into individual image files, which I then pass to the VIs I wrote.
Using the NI Vision Assistant, I created a script that extracts a color plane and processes the image into a binary image, where "off" is blank space and "on" is the line I am trying to find. I then pass this image on to another VI, where I encounter the error "Error -1074396113 occurred at IMAQ Convert Pixel To Real World. The source/input image has not been calibrated" when I attempt to convert the binary image into real-world coordinates using the "convert pixel to real world" VI.
I have calibrated my camera per the instructions in the NI Vision Assistant. Strangely, once the program encounters the error and I stop execution of the VI, if I then re-run just the image processing VIs on the static image still retained in the wire, everything executes properly.
I would appreciate any help in this matter,
Thank you,
German Coast Guard

Have a look at Andrey's reply in this thread:
http://forums.ni.com/ni/board/message?board.id=200&message.id=18963&query.id=75915#M18963
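
A note on what the conversion step actually does, since the error suggests the calibration is not travelling with the image through the processing chain: the pixel-to-real-world step is just a mapping defined by calibration data attached to the image, and an intermediate binary image produced by the color plane extraction and threshold carries no such data, which is why IMAQ Convert Pixel To Real World has nothing to apply. Below is a minimal sketch of the idea in Python, not the IMAQ API; the simple scale-plus-offset calibration and all the numbers are assumptions purely for illustration (real NI Vision calibrations can include perspective and distortion correction).

# Sketch only: a calibration modelled as units-per-pixel scale plus an origin
# offset. The point is that this mapping has to accompany whichever image you
# feed to the conversion step; a freshly thresholded binary image has none.

def pixel_to_real_world(px, py, calibration):
    """Map a pixel coordinate to real-world units using a simple calibration."""
    sx = calibration["units_per_pixel_x"]   # e.g. mm per pixel along x
    sy = calibration["units_per_pixel_y"]   # e.g. mm per pixel along y
    ox, oy = calibration["origin_px"]       # pixel that maps to (0, 0)
    return (px - ox) * sx, (py - oy) * sy

# Assumed values for illustration only.
calibration = {"units_per_pixel_x": 0.25, "units_per_pixel_y": 0.25,
               "origin_px": (320, 240)}
print(pixel_to_real_world(412, 250, calibration))   # -> (23.0, 2.5)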

Similar Messages

  • How to convert pixel Measurement values to real world coordinate systems (2D & 3D)

    Hello All,
    I am very new to image processing and I am stuck at the point where I have to convert the pixel measurement values into real-world X, Y, Z coordinates, so that those values can then be sent to the robot arm.
    For more context I have attached an image of my VI and also the LabVIEW VI itself; please also suggest if something more is missing from what I have already done.
    I need help solving this problem. As I already mentioned, I am very new to image processing, so an answer with a simple explanation would be appreciated.
    Thanks in advance
    Attachments:
    Object_detection_Pixel.vi ‏48 KB

    Hello,
    I would like to help you with this topic.
    I have one question: have you tried to run the VI with highlight mode or with probes to see what happens step by step?
    Concerning the broken wire, as far as I can see it, it's just the dimensions of the output array that do not fit.
    The data is a 2-dimensional array, but you connected a 1-dimensional array.
    So LabVIEW will not run the VI. I posted a screenshot with the new array in the block diagram (BD). As you can see, the wire now has two lines instead of one.
    You can always add suitable front panel (FP) indicators by right-clicking -> Create -> Indicator.
    Please let me know if this is useful for you.
    I am looking forward to hearing from you!
    Have a nice day,
    Christopher W.
    Intern Application Engineering | NI Certified LabVIEW Associate Developer (CLAD) | NI Germany
    Attachments:
    array-wire.png ‏24 KB
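
    For what it's worth, here is a rough illustration of the pixel-to-(X, Y, Z) step itself, separate from the array-wiring issue above. It is a hedged Python sketch, not LabVIEW and not the NI Vision API: it back-projects a pixel through a pinhole camera model, assuming the parts sit on a plane at a known distance from the camera and that the intrinsics fx, fy, cx, cy come from your calibration; all the numbers are placeholders.

    # Pinhole back-projection sketch (assumed intrinsics; not the NI Vision API).
    # With a single camera you only recover X, Y, Z if the depth z is known,
    # e.g. because the parts lie on a fixed work plane in front of the robot.

    def pixel_to_xyz(u, v, z, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
        """Return camera-frame (X, Y, Z) for pixel (u, v) at known depth z."""
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return x, y, z

    # Example: object centroid detected at pixel (412, 267), work plane 550 mm away.
    xyz_camera = pixel_to_xyz(412, 267, 550.0)
    # A further hand-eye transform (camera frame -> robot base frame) would still
    # be needed before sending the coordinates to the robot arm.
    print(xyz_camera)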

  • Making Effective Use of the Hybrid Cloud: Real-World Examples

    May 2015
    Explore
    The Buzz from Microsoft Ignite 2015
    NetApp was in full force at the recent Microsoft Ignite show in Chicago, and it was clear that NetApp's approach to hybrid cloud and Data Fabric resonated with the crowd. NetApp solutions such as NetApp Private Storage for Cloud are solving real customer problems.
    Hot topics at the NetApp booth included:
    OnCommand® Shift. A revolutionary technology that allows you to move virtual machines back and forth between VMware and Hyper-V environments in minutes.
    Azure Site Recovery to NetApp Private Storage. Replicate on-premises SAN-based applications to NPS for disaster recovery in the Azure cloud.
    Check out the following blogs for more perspectives:
    Microsoft Ignite Sparks More Innovation from NetApp
    ASR Now Supports NetApp Private Storage for Microsoft Azure
    Four Ways Disaster Recovery is Simplified with Storage Management Standards
    Introducing OnCommand Shift
    SHIFT VMs between Hypervisors
    Infront Consulting + NetApp = Success
    Richard Treadway
    Senior Director of Cloud Marketing, NetApp
    Tom Shields
    Senior Manager, Cloud Service Provider Solution Marketing, NetApp
    Enterprises are increasingly turning to cloud to drive agility and closely align IT resources to business needs. New or short-term projects and unexpected spikes in demand can be satisfied quickly and elastically with cloud resources, spurring more creativity and productivity while reducing the waste associated with over- or under-provisioning.
    Figure 1) Cloud lets you closely align resources to demand.
    Source: NetApp, 2015
    While the benefits are attractive for many workloads, customer input suggests that even more can be achieved by moving beyond cloud silos and better managing data across cloud and on-premises infrastructure, with the ability to move data between clouds as needs and prices change. Hybrid cloud models are emerging where data can flow fluidly to the right location at the right time to optimize business outcomes while providing enhanced control and stewardship.
    These models fall into two general categories based on data location. In the first, data moves as needed between on-premises data centers and the cloud. In the second, data is located strategically near, but not in, the cloud.
    Let's look at what some customers are doing with hybrid cloud in the real world, their goals, and the outcomes.
    Data in the Cloud
    At NetApp, we see a variety of hybrid cloud deployments sharing data between on-premises data centers and the cloud, providing greater control and flexibility. These deployments utilize both cloud service providers (CSPs) and hyperscale public clouds such as Amazon Web Services (AWS).
    Use Case 1: Financial Services Company Partners with Verizon for Software-as-a-Service Colocation and Integrated Disaster Recovery in the Cloud
    For financial services company BlackLine, availability, security, and compliance with financial standards are paramount. But with the company growing at 50% per year, and with periodic throughput and capacity bursts of up to 20 times baseline, the company knew it couldn't sustain its business model with on-premises IT alone.
    Stringent requirements often lead to innovation. BlackLine deployed its private cloud infrastructure at a Verizon colocation facility. The Verizon location gives them a data center that is purpose-built for security and compliance. It enables the company to retain full control over sensitive data while delivering the network speed and reliability it needs. The colocation facility gives BlackLine access to Verizon cloud services with maximum bandwidth and minimum latency. The company currently uses Verizon Cloud for disaster recovery and backup. Verizon cloud services are built on NetApp® technology, so they work seamlessly with BlackLine's existing NetApp storage.
    To learn more about BlackLine's hybrid cloud deployment, read the executive summary and technical case study, or watch this customer video.
    Use Case 2: Private, Nonprofit University Eliminates Tape with Cloud Integrated Storage
    A private university was just beginning its cloud initiative and wanted to eliminate tape—and offsite tape storage. The university had been using Data Domain as a backup target in its environment, but capacity and expense had become a significant issue, and it didn't provide a backup-to-cloud option.
    The director of Backup turned to a NetApp SteelStore cloud-integrated storage appliance to address the university's needs. A proof of concept showed that SteelStore™ was perfect. The on-site appliance has built-in disk capacity to store the most recent backups so that the majority of restores still happen locally. Data is also replicated to AWS, providing cheap and deep storage for long-term retention. SteelStore features deduplication, compression, and encryption, so it makes efficient use of both storage capacity (in the appliance and in the cloud) and network bandwidth. Encryption keys are managed on-premises, ensuring that data in the cloud is secure.
    The university is already adding a second SteelStore appliance to support another location, and—recognizing which way the wind is blowing—the director of Backup has become the director of Backup and Cloud.
    Use Case 3: Consumer Finance Company Chooses Cloud ONTAP to Move Data Back On-Premises
    A leading provider of online payment services needed a way to move data generated by customer applications running in AWS to its on-premises data warehouse. NetApp Cloud ONTAP® running in AWS proved to be the least expensive way to accomplish this.
    Cloud ONTAP provides the full suite of NetApp enterprise data management tools for use with Amazon Elastic Block Storage, including storage efficiency, replication, and integrated data protection. Cloud ONTAP makes it simple to efficiently replicate the data from AWS to NetApp FAS storage in the company's own data centers. The company can now use existing extract, transform and load (ETL) tools for its data warehouse and run analytics on data generated in AWS.
    Regular replication not only facilitates analytics, it also ensures that a copy of important data is stored on-premises, protecting data from possible cloud outages. Read the success story to learn more.
    Data Near the Cloud
    For many organizations, deploying data near the hyperscale public cloud is a great choice because they can retain physical control of their data while taking advantage of elastic cloud compute resources on an as-needed basis. This hybrid cloud architecture can deliver better IOPS performance than native public cloud storage services, enterprise-class data management, and flexible access to multiple public cloud providers without moving data. Read the recent white paper from the Enterprise Strategy Group, “NetApp Multi-cloud Private Storage: Take Charge of Your Cloud Data,” to learn more about this approach.
    Use Case 1: Municipality Opts for Hybrid Cloud with NetApp Private Storage for AWS
    The IT budgets of many local governments are stretched tight, making it difficult to keep up with the growing expectations of citizens. One small municipality found itself in this exact situation, with aging infrastructure and a data center that not only was nearing capacity, but was also located in a flood plain.
    Rather than continue to invest in its own data center infrastructure, the municipality chose a hybrid cloud using NetApp Private Storage (NPS) for AWS. Because NPS stores personal, identifiable information and data that's subject to strict privacy laws, the municipality needed to retain control of its data. NPS does just that, while opening the door to better citizen services, improving availability and data protection, and saving $250,000 in taxpayer dollars. Read the success story to find out more.
    Use Case 2: IT Consulting Firm Expands Business Model with NetApp Private Storage for Azure
    A Japanese IT consulting firm specializing in SAP recognized the hybrid cloud as a way to expand its service offerings and grow revenue. By choosing NetApp Private Storage for Microsoft Azure, the firm can now offer a cloud service with greater flexibility and control over data versus services that store data in the cloud.
    The new service is being rolled out first to support the development work of the firm's internal systems integration engineering teams, and will later provide SAP development and testing, and disaster recovery services for mid-market customers in financial services, retail, and pharmaceutical industries.
    Use Case 3: Financial Services Leader Partners with NetApp for Major Cloud Initiative
    In the heavily regulated financial services industry, the journey to cloud must be orchestrated to address security, data privacy, and compliance. A leading Australian company recognized that cloud would enable new business opportunities and convert capital expenditures to monthly operating costs. However, with nine million customers, the company must know exactly where its data is stored. Using native cloud storage is not an option for certain data, and regulations require that the company maintain a tertiary copy of data and retain the ability to restore data under any circumstances. The company also needed to vacate one of its disaster-recovery data centers by the end of 2014.
    To address these requirements, the company opted for NetApp Private Storage for Cloud. The firm placed NetApp storage systems in two separate locations: an Equinix cloud access facility and a Global Switch colocation facility both located in Sydney. This satisfies the requirement for three copies of critical data and allows them to take advantage of AWS EC2 compute instances as needed, with the option to use Microsoft Azure or IBM SoftLayer as an alternative to AWS without migrating data. For performance, the company extended its corporate network to the two facilities.
    The firm vacated the data center on schedule, a multimillion-dollar cost avoidance. Cloud services are being rolled out in three phases. In the first phase, NPS will provide disaster recovery for the company's 12,000 virtual desktops. In phase two, NPS will provide disaster recovery for enterprise-wide applications. In the final phase, the company will move all enterprise applications to NPS and AWS. NPS gives the company a proven methodology for moving production workloads to the cloud, enabling it to offer new services faster. Because the on-premises storage is the same as the cloud storage, making application architecture changes will also be faster and easier than it would be with other options. Read the success story to learn more.
    NetApp on NetApp: nCloud
    When NetApp IT needed to provide cloud services to its internal customers, the team naturally turned to NetApp hybrid cloud solutions, with a Data Fabric joining the pieces. The result is nCloud, a self-service portal that gives NetApp employees fast access to hybrid cloud resources. nCloud is architected using NetApp Private Storage for AWS, FlexPod®, clustered Data ONTAP and other NetApp technologies. NetApp IT has documented details of its efforts to help other companies on the path to hybrid cloud. Check out the following links to learn more:
    Hybrid Cloud: Changing How We Deliver IT Services [blog and video]
    NetApp IT Approach to NetApp Private Storage and Amazon Web Services in Enterprise IT Environment [white paper]
    NetApp Reaches New Heights with Cloud [infographic]
    Cloud Decision Framework [slideshare]
    Hybrid Cloud Decision Framework [infographic]
    See other NetApp on NetApp resources.
    Data Fabric: NetApp Services for Hybrid Cloud
    As the examples in this article demonstrate, NetApp is developing solutions to help organizations of all sizes move beyond cloud silos and unlock the power of hybrid cloud. A Data Fabric enabled by NetApp helps you more easily move and manage data in and near the cloud; it's the common thread that makes the use cases in this article possible. Read Realize the Full Potential of Cloud with the Data Fabric to learn more about the Data Fabric and the NetApp technologies that make it possible.
    Richard Treadway is responsible for NetApp Hybrid Cloud solutions including SteelStore, Cloud ONTAP, NetApp Private Storage, StorageGRID Webscale, and OnCommand Insight. He has held executive roles in marketing and engineering at KnowNow, AvantGo, and BEA Systems, where he led efforts in developing the BEA WebLogic Portal.
    Tom Shields leads the Cloud Service Provider Solution Marketing group at NetApp, working with alliance partners and open source communities to design integrated solution stacks for CSPs. Tom designed and launched the marketing elements of the storage industry's first Cloud Service Provider Partner Program—growing it to 275 partners with a portfolio of more than 400 NetApp-based services.

    Dave:
    "David Scarani" <[email protected]> wrote in message
    news:3ecfc046$[email protected]..
    >
    > I was looking for some real world "Best Practices" of deploying J2EE applications
    > into a Production WebLogic Environment.
    > We are new at deploying applications to J2EE application servers and are currently
    > debating 2 methods.
    > 1) Store all configuration (application as well as Domain configuration) in properties
    > files and use Ant to rebuild the domain every time the application is deployed.
    I am just a WLS engineer, not a customer, so my opinions carry in some
    regards little relative weight. However, I think you'll get more mileage out
    of creating your config.xml once, checking it into source
    control, and versioning it. I would imagine that application changes are more
    frequent than server/domain configuration changes, so it seems a little heavyweight
    to regenerate the entire configuration every time an application is
    deployed/redeployed. Either way you should check out the wlconfig Ant task.
    Cheers
    mbg
    > 2) Have a production domain built one time, configured as required and always
    > up and available, then use Ant to deploy only the J2EE application into the existing,
    > running production domain.
    > I would be interested in hearing how people are doing this in their production
    > environments and any pros and cons of one way over the other.
    > Thanks.
    > Dave Scarani

  • Real World Adobe Photoshop CS3 (Real World)

    Real World Adobe Illustrator CS3 (Real World) - Mordy Golding;
    Real World Adobe Photoshop CS3 (Real World) - David Blatner;
    are these books at a HIGHER LEVEL than the "classroom in a book"
    series?


  • RAID test on 8-core with real world tasks gives 9% gain?

    Here are my results from testing the software RAID set up on my new (July 2009) Mac Pro. As you will see, although my 8-core (Octo) tested twice as fast as my new (March 2009) MacBook 2.4 GHz, the software RAID set up only gave me a 9% increase at best.
    Specs:
    Mac Pro 2x 2.26 GHz Quad-core Intel Xeon, 8 GB 1066 MHz DDR3, 4x 1TB 7200 Apple Drives.
    MacBook 2.4 GHz Intel Core 2 Duo, 4 GB 1067 MHz DDR3
    Both running OS X 10.5.7
    Canon Vixia HG20 HD video camera shooting in 1440 x 1080 resolution at “XP+” AVCHD format, 16:9 (wonderful camera)
    The tests. (These are close to my real world “work flow” jobs that I would have to wait on when using my G5.)
    Test A: import 5:00 of video into iMovie at 960x540 with thumbnails
    Test B: render and export with Sepia applied to MPEG-4 at 960x540 (a 140 MB file) in iMovie
    Test C: in QuickTime resize this MPEG-4 file to iPod size .m4v at 640x360 resolution
    Results:
    Control: MacBook as shipped
    Test A: 4:16 (four minutes, sixteen seconds)
    Test B: 13:28
    Test C: 4:21
    Control: Mac Pro as shipped (no RAID)
    Test A: 1:50
    Test B: 7:14
    Test C: 2:22
    Mac Pro config 1
    RAID 0 (no RAID on the boot drive, three 1TB drives striped)
    Test A: 1:44
    Test B: 7:02
    Test C: 2:23
    Mac Pro config 2
    RAID 10 (drives 1 and 2 mirrored, drives 3 and 4 mirrored, then both mirrors striped)
    Test A: 1:40
    Test B: 7:09
    Test C: 2:23
    My question: Why am I not seeing an increase in speed on these tasks? Any ideas?
    David
    Notes:
    I took this to the Apple store and they were expecting 30 to 50 per cent increase with the software RAID. They don’t know why I didn’t see it on my tests.
    I am using iMovie and QuickTime because I just got the Adobe CS4 and ran out of cash. And it is fine for my live music videos. Soon I will get Final Cut Studio.
    I set up the RAID with Disk Utility without trouble. (It crashed once but reopened and set up just fine.) If I check back it shows the RAID set up working.
    Activity Monitor reported “disk activity” peaks at about 8 MB/sec on both QuickTime and iMovie tasks. The CPU number (percent?) on QT was 470 (5 cores involved?) and iMovie was 294 (3 cores involved?).
    Console reported the same error for iMovie and QT:
    7/27/09 11:05:35 AM iMovie[1715] Error loading /Library/Audio/Plug-Ins/HAL/DVCPROHDAudio.plugin/Contents/MacOS/DVCPROHDAudio: dlopen(/Library/Audio/Plug-Ins/HAL/DVCPROHDAudio.plugin/Contents/MacOS/DVCPROHD Audio, 262): Symbol not found: _keymgr_get_per_threaddata
    Referenced from: /Library/Audio/Plug-Ins/HAL/DVCPROHDAudio.plugin/Contents/MacOS/DVCPROHDAudio
    Expected in: /usr/lib/libSystem.B.dylib

    The memory controllers, one for each CPU, mean that you need at least 2 x 2GB on each bank. If that is how Apple set it up, that is minimal, and the only thing I would do now with RAM is add another 2 x 2GB. That's all. That gets you into triple-channel bandwidth.
    It could be the make and model of your hard drives. If they are Seagate, then more info would help. And not all drives are equal when it comes to RAID.
    Are you new to RAID, or is this something you've been doing? Seems you had enough to build 0+1 and do some testing. Though I'm not pleased, even if it works now, that it didn't take the first time.
    Drives - and RAIDs - improve over the first week or two, which, before committing good data to them, is the best time to torture them: run them ragged, use SpeedTools to break them in, loosen up the heads, scan for media errors, and run ZoneBench (and with 1TB, partition each drive into 1/4ths).
    If Drive A is not identical to B, then they may deal with an array even worse. No two drives are purely identical, some vary more than others, and some are best used in hardware RAID controller environments.
    Memory: buying in groups of three is okay. But then adding 4 x 4GB? That gives bank A 4 x 2GB and bank B twice as much memory. On a Mac Pro, with 4 DIMMs on a bank you get 70% bandwidth; it drops from tri-channel to dual-channel mode.
    I studied how to build or put together a PC for over six months, but then learned more in the month (or two) after I bought all the parts, found what didn't work, learned my own shortcomings, and ended up building TWO - one for testing, the other as a backup system. And three motherboards (the best-'rated' one also had more trouble with BIOS and fans, the cheap one was great, and the Intel board that reviewers didn't seem to "grok" has actually been the best and easiest to use and update the BIOS on). Hands-on wins 3:1 versus trying to learn by reading, for me; hands-on is what I need in order to learn. Or take the car or sailboat out for a drive or a spin and see how it fares in rough weather.
    I buy an Apple system bare bones, stock, or less, then do all the upgrades on my own, when I can afford to, gradually over months or a year.
    Each CPU needs to be fed. So they each need at least 3 x 1GB RAM. And they need raw data fed to RAM and CPU from the disk drives. And your mix of programs will each behave differently, which is why you see Barefeats test with Pro Apps, CINEBENCH, and other apps or tools.
    What did you read or do in the past that led you to think you need RAID setup, and for how it would affect performance?
    Photoshop Guides to Performance:
    http://homepage.mac.com/boots911/.Public/PhotoshopAccelerationBasics2.4W.pdf
    http://kb2.adobe.com/cps/401/kb401089.html
    http://www.macgurus.com/guides/storageaccelguide.php
    4-core vs 8-core
    http://www.barefeats.com/nehal08.html
    http://www.barefeats.com/nehal03.html
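
    A back-of-the-envelope check of the numbers already in the post also points at an answer: the best gain (Test A on the RAID 10 set) is about 9%, and Activity Monitor peaked around 8 MB/sec, far below what even a single 7200 rpm drive of that era can sustain (roughly 80-100 MB/sec is my assumption). If the application never pushes the disk anywhere near one drive's limit, striping more drives can't speed it up; the high CPU percentages suggest the import/export steps are CPU-bound. A small Python sketch of that arithmetic:

    # Back-of-the-envelope check using figures from the post; the single-drive
    # throughput is an assumed ballpark, everything else comes from the tests.

    def pct_gain(before_s, after_s):
        return (before_s - after_s) / before_s * 100

    # Test A (import 5:00 of video): stock Mac Pro 1:50 vs RAID 10 1:40.
    print(f"Test A gain with RAID 10: {pct_gain(110, 100):.1f}%")   # ~9.1%

    single_drive_mb_s = 100      # assumed sustained throughput of one drive
    observed_peak_mb_s = 8       # Activity Monitor peak from the post
    print(f"Disk utilization vs one drive: {observed_peak_mb_s / single_drive_mb_s:.0%}")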

  • Character Styles in the Real World

    Rick:
    Thanks for your efforts, and let me add my Amen to both
    subjects (on file locations and on Character styles).
    My real-world use of Character styles is a combination usage
    of Paragraph and Character styles for Notes: I have a Paragraph
    style called Note, which simply adds margins of .15in Left, 10pt
    Top, and 8pt Bottom. Within this paragraph style, multiple labels
    announce the type of Note with the use of Character styles
    NoteLabel (Navy), RecommendLabel (Teal), CAUTIONLabel (Purple), and
    WARNINGLabel (Red).
    This way, you can change the color of one or more labels
    without worrying about the paragraph settings (or vice versa).
    Also, when placing a Note inside a table cell (which might
    have limited horizontal space, especially with three or four
    columns), we still use the "Label" character styles but
    without the Notes paragraph style. This still sets off the
    text visually, without adding unnecessary extra vertical space.
    Thanks again, Rick!
    Leon


  • Real World Item Level Permission Performance?

    I am considering implementing item level permission on a list we use. I've seen all the articles online cautioning not to do this with lists of more than 1000 items, but the articles seem to have little detailed information about the actual impact and what
    causes the performance issues. Additionally, they seem to refer to document libraries more than lists. I'd like some feedback about what might occur if we were to use item level security in our situation.
    Our situation is this: list of current ~700 items in a sharepoint list. Expected to grow around 700 items per year. The list has about 75 fields on it. We have 8 active-directory groups that have access to the list, based upon company department. Each
    item in the list can apply to one or more departments. The groups represent around 100-150 different unique users.
    We would like to use item level security to be set via workflow, to enable particular groups to access the item based upon their group membership. For example, if the list item is for the HR department, then the HR group has access. If the item is for IT,
    then the IT group has access (and HR wouldn't).
    That's it. There would be no nesting of items with multiple permission levels, no use of user-level ACLs on the items, etc.
    Thoughts about this configuration and expected performance issues?  Thanks for any feedback!
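
    Not a performance answer, but to make the mechanism concrete, here is a rough Python sketch of the kind of per-item permission assignment such a workflow would perform, using the standard SharePoint REST endpoints for breaking role inheritance and adding a role assignment, as best I recall them. The site URL, list name, group principal ID, role definition ID, and authentication details are all placeholders or assumptions; treat it as a sketch to verify against your environment rather than a drop-in script.

    # Sketch: stop an item inheriting from the list, then grant one AD group a role.
    # Endpoint names are the standard SharePoint REST calls as best I recall them;
    # URLs, IDs, headers, and auth are placeholders.
    import requests

    SITE = "https://example.sharepoint.com/sites/demo"     # placeholder
    LIST_TITLE = "Requests"                                 # placeholder
    HEADERS = {"Accept": "application/json;odata=verbose",
               "X-RequestDigest": "<form digest here>"}     # plus auth cookie/token

    def grant_item_to_group(item_id, group_id, role_def_id, session):
        base = f"{SITE}/_api/web/lists/getbytitle('{LIST_TITLE}')/items({item_id})"
        # Break inheritance: keep no existing assignments, clear child scopes.
        session.post(base + "/breakroleinheritance(copyRoleAssignments=false,"
                            "clearSubscopes=true)", headers=HEADERS)
        # Grant the department group (by principal id) the chosen role definition.
        session.post(base + "/roleassignments/addroleassignment"
                            f"(principalid={group_id},roledefid={role_def_id})",
                     headers=HEADERS)

    # Usage with placeholder IDs: item 42, HR group principal 12, Contribute role.
    # grant_item_to_group(42, 12, 1073741827, requests.Session())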

    Just an update for anyone who finds this thread:
    I converted our data into a test SharePoint list with 1500 rows. I then enabled full item-level security, with restrictions to hide data not created by the person.
    I then set individual permissions for each item that included 2-3 AD groups with different permissions--contribute, full ownership, etc, and 2-3 individuals with varying permissions. The individuals represented around 50 total people.
    After the permissions were set I then did a comparison of loading individual views and the full data set in Standard and Datasheet views, for both myself as an administrator with full list access and with several of the individuals who only had access to
    their designated items--typically 75-100 of the total list.
    The results were that I found no discernible difference in system performance at the user interface level while loading list views after the item-level security was configured in this way. I understand this will vary based upon
    hardware configuration and exact permission configuration, but in our situation the impact of item-level security on a list of 1500 items had very little, if any, negative performance impact. Note that I didn't check performance at the database server level,
    but I'm assuming the impact there was minimal since the front-end user experience was unaffected.
    I expect we'll put this solution into place and if we do I'll update this post when we have additional real-world usage information.

  • Can anyone explain non-predefined exceptions with a real world example

    Can anyone explain non-predefined exceptions with a real world example?

    Is this what you mean?
    i.e. When others catches all errors - so it is not predefined
    Predefined errors are things like no rows, too many rows etc - meant to catch specific errors
    It is the difference between tolerating one 'bad' thing specifically and tolerating all 'bad' things.
    DECLARE
       v_text  varchar2(1);
    BEGIN
       v_text:= 'THIS STRING IS TOO LONG FOR THIS VARIABLE';
    EXCEPTION
       WHEN OTHERS THEN --this is non error specific and generally VERY BAD practice as it suppresses all errors
           dbms_output.put_line('Whatever you want to do when an unknown error manifests here');
    END;

  • It's Shipping -- Real World Camera Raw for CS3

    Folks,
    Just a quick ping to let you know that Real World Camera Raw with Adobe Photoshop CS3 is shipping. I got my copy today (but hey, I'm the co-author, I'm supposed to get mine early).
    :~)
    I've done a story on PhotoshopNews.com about the book shipping and a feature story about the printing of the book.
    See:
    It's Shipping--Real World Camera Raw with Adobe Photoshop CS3
    and...
    Printing RWCR CS3
    Now, back to your regularly scheduled programming....

    > but the part about DNG has convinced me to dive deeper in it and give it a go
    When working in a Bridge/Camera Raw/Photoshop workflow, I tend to ingest the actual native raw files, do initial selects and gross edits and basic metadata work via templates and THEN do the conversion to DNG. I'll use the DNGs as my working files and the original raws as an archive. I tend to do this more with studio shoots. I tend to use Lightroom when I'm on the road.
    When working in Lightroom first, I tend to ingest and convert to DNG upon ingestion (when on the road working on a laptop) while using the backup copy, usually working on a pair of external FW drives, one for working DNG files and one for backup of the original raws. Then, when I get back to the studio I make sure I write to XMP and export the new shoot as a catalog and import it into my studio copy of Lightroom. Then I'll also cache the newly imported images in Bridge as well so I can get at the images either in Bridge or Lightroom.
    It's a bit of a chore now since I do work in Camera Raw a lot (well, DOH, I had to do the book!) but I also keep all my digital files in a Lightroom catalog which is now up to about 74K...
    Then, depending on what I'll need to do, I'll either work out of LR or Bridge/Camera Raw...
    If I'm doing a high-end final print, I generally process out of Camera Raw as a Smart Object and stack multiple layers of CR processed images...if I'm working on a batch of images I'll work out of Lightroom since the workflow seems to suit me better.
    In either event, I've found DNG to be better than native raws with sidecar files.

  • Grids in the real world

    Hi....sort of a cryptic subject, but here's what I'm trying to do:
    What is the easy way (not too much thinking) to use a grid that will help me with the size and placement of two objects?
    Let's say I have an image taken with a digital camera of an interior house wall. The real size of the wall is 19' x 12'. My plan is to size and print this "wall" image onto an 8 x 11.5 paper. But before printing, I want to paste into the "wall" image two rectangles that are 24" x 36"-----intended to be pictures hanging on the wall.
    The confusion I have is how to make the relationship to real size (inches and feet) work for the sizing, placement, and printing of all the objects, with a grid that would also represent a scale like 1"=1'.
    Any ideas? Hoping to hear from you.
    Mike

    Hi Mike,
    Get out your calculator, :-)
    You should have the wall width dimension (19') as the image's longer
    dimension (landscape).
    You may need to crop the image to fit the wall width and height to get
    the proper dimensions. You can use "Image" > "Transform" > "Perspective"
    to square the wall in the image.
    Divide the wall pixel dimension by the real-world dimension in feet.
    The answer is what you want to set the "Gridline every [answer] pixels"
    setting to. If you set "Subdivisions" to 12, you will have a line
    every inch.
    HTH,
    Alex B.,
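
    To make the arithmetic above concrete, here is a small worked example in Python. The 2850-pixel image width is an assumption purely for illustration; the 19 ft wall and the 24" x 36" pictures come from the question.

    # Worked example of the reply's arithmetic (image width of 2850 px is assumed).
    wall_px_wide = 2850                # assumed pixel width of the cropped wall image
    wall_ft_wide = 19                  # real wall width from the question

    px_per_foot = wall_px_wide / wall_ft_wide   # 150 px -> the "Gridline every" value
    px_per_inch = px_per_foot / 12              # with 12 subdivisions: one line per inch

    # A 24" x 36" picture pasted into the wall image should then be drawn as:
    pic_w_px = 24 * px_per_inch                 # 300 px wide
    pic_h_px = 36 * px_per_inch                 # 450 px tall
    print(px_per_foot, px_per_inch, pic_w_px, pic_h_px)   # 150.0 12.5 300.0 450.0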

  • RMI Use in the real world

    I present an RMI module in the Java course I teach. I know enough about RMI to be able to talk about it, and write a simple classroom example, but I have never done RMI in the real world. Can anyone offer an example of what kind of applications are being developed that use RMI?
    Thanks,
    J.D.

    I can tell you about two sites.
    1. A system which allocates and dispatches crews, trucks, backpack hoses, spare socks, etc to bushfires (wildfires to you). It operates between two Government departments here in Australia. Each of those despatchable items is a remote object and there have been up to 50,000 active in the system at a time during the hot summer months. This is a large and life-critical system.
    2. A monitoring system for cable TV channels. A piece of hardware produces a data stream representing things like channel utilization, error rates, delay, etc and this is multiplexed via RMI to a large number of operator consoles. Again this is a major and business-critical system.
    And of course every J2EE system in existence uses RMI internally, albeit almost entirely RMI/IIOP.

  • Error occurred while converting the file...

    So, it seems like every CD that I tried importing over the last couple of days at some point caused the software to stop and go:
    Error occurred while converting the file "[Track name here]". An unknown error occurred (-50).
    Has a solution been found for this? Have others experienced this problem? How do I fix this?

    Importing problems have been rife in the recent iTunes release.
    Not sure how to solve your problem, but until a solution comes along feel free to rip in another program, such as Windows Media Player, and add the resulting MP3s to your iTunes library.

  • 'Error occurred while converting the file' -- unknown error occurred (0x77686174)

    I just got my first iPod but I'm a little frustrated. I put in a CD, iTunes opens and asks me if I want to import the songs, and I reply yes. When it tries the very first song of the CD, it immediately comes up with 'Error occurred while converting the file "name of song". An unknown error occurred (0x77686174).' It doesn't matter if this is a completely new CD or an older one.
    I ran CD diagnostics on iTunes and my results are below. I have posted this several times on many different boards and haven't found any solutions or been given any solutions from anyone. I have an iPod that I can't use because I can't download any songs...does anyone have any idea what to do?
    I've looked at Dell to update drivers for my cd drive but it says that I have the newest drivers...the computer isn't even a year old yet. What do I do?? I've wiped the cd like several posts say to do, I've clicked the fix error option on cd off and on like several posts have said to do, I've changed the conversion of 128 to 64 like several posts have said to do...help please, anyone?
    Microsoft Windows XP Home Edition Service Pack 2 (Build 2600)
    Dell Inc. Dell DM051
    iTunes 7.3.1.3
    CD Driver 2.0.6.1
    CD Driver DLL 2.0.6.2
    LowerFilters: PxHelp20 (2.0.0.0), DRVMCDB (1.0.0.1), DLACDBHM (1.0.0.1),
    UpperFilters: GEARAspiWDM (2.0.6.1),
    Current user is an administrator.
    Video Display Information:
    RADEON X300 SE 128MB HyperMemory
    RADEON X300 SE 128MB HyperMemory Secondary
    Connected Device Information:
    Disk drive, ST3160812AS, Bus Type ATA, Bus Address [0,0]
    Disk drive, TEAC USB HS-CF Card USB Device, Bus Type USB
    Disk drive, TEAC USB HS-MS Card USB Device, Bus Type USB
    Disk drive, TEAC USB HS-SD Card USB Device, Bus Type USB
    Disk drive, TEAC USB HS-xD/SM USB Device, Bus Type USB
    CD-ROM Drive, TSSTcorp CDRWDVD TS-H492C, Bus Type ATA, Bus Address [0,0]
    If you have multiple drives on the same IDE or SCSI bus, these drives may interfere with each other.
    Some computers need an update to the ATA or IDE bus driver, or Intel chipset. If iTunes has problems recognizing CDs or hanging or crashing while importing or burning CDs, check the support site for the manufacturer of your computer or motherboard.
    D: TSSTcorp CDRWDVD TS-H492C, Rev DE02
    Audio CD in drive.
    Found 12 songs on CD, playing time 54:35 on Audio CD.
    Track 1, start time 00:02:00
    Track 2, start time 03:30:32
    Track 3, start time 09:38:25
    Track 4, start time 14:58:53
    Track 5, start time 18:01:08
    Track 6, start time 21:24:57
    Track 7, start time 26:37:43
    Track 8, start time 32:16:51
    Track 9, start time 36:17:54
    Track 10, start time 41:57:36
    Track 11, start time 46:21:38
    Track 12, start time 51:48:24
    Audio CD reading succeeded.
    Get drive speed succeeded.
    The drive CDR speeds are: 48.
    The drive CDRW speeds are: 48.

    I came across your error code in iLounge, this post might help you out: http://forums.ipodlounge.com/showthread.php?threadid=30496

  • Error Occurred While Converting the file"name of song" The file name.......

    I get this message when I try to import some CD's to my library.
    A window pops up and says
    Error occurred while converting the file "name of song"
    The file name was invalid or too long.
    Anyone else ever get this message, and how can you import CD's???????

    I have had this exact same problem. I'm not sure how to fix it...

  • Error occurred while converting the file . . . . . .

    While trying to import a music CD into iTunes, I get the following message:
    Error occurred while converting the file "Handel: Messiah, HWV 56 - Sinfony". You do not have the privilege to make changes.
    The song in question is: Handel: Messiah, HWV 56 - Sinfony.
    I've imported a number of CDs into iTunes and have never seen this before.

    Hi, LongIsland14. 
    Thank you for visiting Apple Support Communities.
    This issue is most likely related to a permissions problem with the iTunes Media folder. Here are a couple of articles that will help you troubleshoot it.
    Trouble adding music to iTunes library or importing audio CD
    http://support.apple.com/kb/ts1387
    iTunes: Missing folder or incorrect permissions may prevent authorization
    http://support.apple.com/kb/ts1277
    Cheers,
    Jason H.
