GoldenGate for Big Data 12c for Win x64?

I was looking for the GoldenGate for Big Data download for Windows x64, and all I found on edelivery was the Linux, Solaris, HP-UX and AIX platforms, but no Windows at all. Has it been released yet, or is it just an unfortunate omission?
Thanks
Andy

Thanks for your reply, Karan!
I tried following your advice, but bumped into another, similar problem. I've installed OGG 12c, and now I can't find the matching version of the GoldenGate Application Adapters for JMS and Flat File for the Win x64 platform. The latest version of the Application Adapters available on edelivery is 11.1.1.0.0, which means I would need to downgrade OGG to the same version. No big deal, but I wanted to make sure I'm not missing anything.
Does anybody know whether the Application Adapters 12c for JMS and Flat File are available for the Win x64 platform, and if so, where I can download them?
Thanks
Andy

Similar Messages

  • What is the best big data solution for interactive queries of rows with up?

    We have a simple table such as follows:
    | Name | Attribute1 | Attribute2 | Attribute3 | ... | Attribute200 |
    | Name1 | Value1 | Value2 | null | ... | Value3 |
    | Name2 | null | Value4 | null | ... | Value5 |
    | Name3 | Value6 | null | Value7 | ... | null |
    | ... |
    But there could be up to hundreds of millions of rows/names. The data will be populated every hour or so.
    The goal is to get results for interactive queries on the data within a couple of seconds.
    Most queries look like:
    select count(*) from table
    where Attribute1 = Value1 and Attribute3 = Value3 and Attribute113 = Value113;
    The WHERE clause contains an arbitrary number of attribute name-value pairs.
    I'm new to big data and wondering what the best option is in terms of data store (MySQL, HBase, Cassandra, etc.) and processing engine (Hadoop, Drill, Storm, etc.) for interactive queries like the above.

    Hi,
    As always, the correct answer is "it depends".
    - Will there be more reads (queries) or writes (INSERTs)?
    - Will there be any UPDATEs?
    - Does the use case require any of the ACID guarantees, or would "eventual consistency" be fine?
    At first glance, Hadoop (HDFS + MapReduce) doesn't look like a viable option, since you require "interactive queries". Also, if you require any level of ACID guarantees or UPDATE capabilities, the best (and arguably only) solution is an RDBMS. Keep in mind that millions of rows are pocket change for a modern RDBMS on average hardware.
    On the other hand, if there will be a lot more queries than inserts, very few or no updates at all, and eventual consistency is not a problem, I'd probably recommend testing a key-value store (such as Oracle NoSQL Database). The idea would be to use (AttributeX, ValueY) as the key, and a sorted list of the Names that have ValueY for their AttributeX as the value. This way you do only as many reads as there are attributes in the WHERE clause, and then compute the intersection (very easy and fast with sorted lists; see the sketch after this reply).
    Also, I'd do this computation manually. SQL may be comfortable, but I don't think it's big-data ready yet (unless you chose the RDBMS way, of course).
    I hope it helped,
    Joan
    Edited by: JPuig on Apr 23, 2013 1:45 AM
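    To illustrate the intersection idea in relational terms, here is a minimal SQL sketch. It assumes a hypothetical narrow table ATTR_KV (name, attr, val) with one row per populated attribute; each branch of the INTERSECT plays the role of one sorted-list read, and INTERSECT itself is the list intersection:
    -- Hypothetical layout: one row per populated attribute.
    -- CREATE TABLE attr_kv (name VARCHAR2(100), attr VARCHAR2(30), val VARCHAR2(100));
    -- CREATE INDEX attr_kv_ix ON attr_kv (attr, val, name);  -- keeps each (attr, val) "list" sorted
    SELECT COUNT(*)
    FROM (
          SELECT name FROM attr_kv WHERE attr = 'Attribute1'   AND val = 'Value1'
          INTERSECT
          SELECT name FROM attr_kv WHERE attr = 'Attribute3'   AND val = 'Value3'
          INTERSECT
          SELECT name FROM attr_kv WHERE attr = 'Attribute113' AND val = 'Value113'
         );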

  • Strategy for big data

    Dear experts,
    Currently I'm facing a big data problem. We have about 1 TB of transaction records per month.
    Now I'm trying to create data marts for that and install OBIEE. What are the strategy and steps?
    Please advise...
    BR,
    Eba

    Denis,
    In this case you can do it two ways.
    1. Proxies - You will have to develop a custom report which collects all the data that needs to be sent and calls the proxy with the collected data as input.
    2. IDocs - If you are dealing with standard IDocs, this is easier. You can activate the configuration to send the IDocs for contracts for all the operations that you have mentioned. Do the required outbound configuration in WE20 to specify the target system as XI.
    I am not sure why you are even thinking of scheduling a BPM in XI that will invoke the RFC. SAP itself has scheduling capabilities; I would rather suggest you use those.
    Regards,
    Ravi

  • Working with R packages for Big Data

    Hi,
    I wonder which R packages from the big data and parallel processing family are relevant for work in ML Studio.
    Does ML Studio use MapReduce when running an R script? If yes, the RHadoop package seems not useful.
    Would the snowfall package for parallel processing help with high-volume datasets? Will it exploit several CPUs?
    Thanks in advance

    Currently, the R scripts are executed on a single VM. You can manually set up a map-reduce pattern by splitting the data and having multiple Execute R Script modules run in parallel in your experiment graph.
    -Roope

  • Drill down report for due date analysis for customer open items

    Hi, in transaction FDI0 I am using report 0SAPDUEAN-01 (Due Date Analysis for Open Items). The reason I am using this is that S_ALR_87012178 caters for only 6 intervals. With this report, I get 8 intervals:
    Daily intervals     Due          Not Due   Total OI
    0 - 30              0,00         0,00      0,00
    31 - 60             67.000,00-   0,00      67.000,00-
    61 - 90             0,00         0,00      0,00
    91 - 120            20.020,86    0,00      20.020,86
    121 - 150           3.270,00     0,00      3.270,00
    151 - 180           0,00         0,00      0,00
    181 - 210           0,00         0,00      0,00
    211 - 99999         0,00         0,00      0,00
    Total open items    43.709,14-   0,00      43.709,14-
    Is it possible to change the intervals through customizing? I need the intervals:
    0 - 30
    31 - 60
    91-120
    121-150
    151-365
    >365

    Hi AA
    Refer to this link, where I have given the screenshots:
    http://img233.imageshack.us/g/86081486.jpg/
    The 1st screenshot is a copy of the standard form.
    The 2nd screenshot shows how to add a new interval.
    Br, Ajay M

  • Creating process for multiple Date fields for update or insert in APEX

    Hello there,
    Could someone please help me?
    I have a form in APEX based on a view over three tables; updating and inserting work fine using an INSTEAD OF trigger.
    My problem now is that my form has around 75 fields (items), including 30 or more date fields which could be populated, left blank, or updated later.
    For each date field I have two boxes: one for the date, input as dd/mm/yyyy (text field), and a second for the time, input as 23:45. All dates will be inserted or updated manually by the user, so, as mentioned, not all date fields may be populated at one stage.
    I have created some processes and validations, and they all work fine, but I found that if a date is left blank, the concatenated ':' gives me a problem, so I added the following further process for each date field. In the real table all the date fields have data type DATE.
    declare
      v_my_var date;          -- combined date/time for the first date field
      str_dy   VARCHAR2(10);  -- date part, 'DD/MM/YYYY'
      str_tm   VARCHAR2(20);  -- date plus time, 'DD/MM/YYYY HH24:MI'
    begin
      str_dy := :p4_first_date;
      IF str_dy is not null THEN
        -- build and convert the timestamp only when a date was entered,
        -- so a blank date no longer feeds a bare ':' into TO_DATE
        str_tm   := str_dy || ' ' || substr(:p8_first_date_hh, 1, 2) || ':' || substr(:p8_first_date_hh, 4, 2);
        v_my_var := to_date(str_tm, 'DD/MM/YYYY HH24:MI');
      ELSE
        v_my_var := NULL;
      END IF;
      update table1 set my_date = v_my_var where d_id = :p4_d_id;
    end;
    The above code works fine, but for one date field only; I would have to repeat the same code for each date field, with changes, and initialise the variables again and again for each field.
    So I would like to ask: is there an easier way that is more professional? I was thinking about a procedure using a collection or similar, but honestly I don't have much experience with that, so could someone please help me?
    I will be very thankful.
    KRgds

    Hi,
    You can do the needful by re-using the code if you name the items P8_DATE1, P8_DATEhh1, P8_DATE2, P8_DATEhh2, etc., so that each item name differs only by a sequence number.
    Now write a function which returns the desired date value, taking the above items as input. Pass the item names to this function and read their session state with the APEX_UTIL.GET_SESSION_STATE('item_name') API (a sketch of such a function follows this reply).
    Now modify your code as:
    FOR i IN 1..30 LOOP
      v_date_array(i) := f_get_date('P8_DATE' || i, 'P8_DATEhh' || i);
    END LOOP;
    Now you have all the date values in the array. Just write one update as follows:
    UPDATE table1
       SET date1 = v_date_array(1),
           date2 = v_date_array(2)
           -- ...and so on for the remaining columns
     WHERE ...
    Hope it helps :)
    Cheers,
    Hari
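    For completeness, here is a minimal sketch of the f_get_date helper referenced above; the thread never shows it, so the function name and its internals are assumptions (APEX_UTIL.GET_SESSION_STATE is the real APEX API):
    -- Hypothetical helper: combines a 'DD/MM/YYYY' item and an 'HH24:MI' item into a DATE.
    CREATE OR REPLACE FUNCTION f_get_date (
      p_date_item IN VARCHAR2,  -- name of the date item, e.g. 'P8_DATE1'
      p_time_item IN VARCHAR2   -- name of the time item, e.g. 'P8_DATEhh1'
    ) RETURN DATE
    IS
      v_date VARCHAR2(10);
      v_time VARCHAR2(20);
    BEGIN
      v_date := APEX_UTIL.GET_SESSION_STATE(p_date_item);
      IF v_date IS NULL THEN
        RETURN NULL;  -- a blank date field stays NULL
      END IF;
      v_time := NVL(APEX_UTIL.GET_SESSION_STATE(p_time_item), '00:00');
      RETURN TO_DATE(v_date || ' ' || v_time, 'DD/MM/YYYY HH24:MI');
    END f_get_date;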

  • Should we really go for bean data controls for a new project?

    Hi,
    I am still new to data controls and trying to figure out the advantages of using bean data controls for our new project. Our UI is going to have customized UI components, and our back end is going to be a TCP/IP server.
    Is it a good idea to develop Java beans and then create data controls to bind them to the UI layer? I think it makes sense to use data controls if we want to reuse existing Java beans. Maybe we would be separating out the model layer by using data controls, but the only thing they would be doing for us is making simple object calls to my Java beans. Would it be better to use data controls, or should I just make the object calls directly?
    Thanks,
    Manoj

    Hi,
    the POJO data control will always give you a benefit in developer productivity, unless what you have to build fits on a single page - in which case you may not mind the burden of manual UI component binding.
    Frank

  • User exit for Additional Data B for sales order item

    Hi all,
    My client's requirement is:
    (This business requirement is to make the Last Price for a given item visible during order entry.)
    • Retrieve and display during order entry the most recent unit price given to a customer for a specific item, from the billing data.
    • Display the Last Price under the Additional Data B screen.
    So: add a new field (Last Extended Price) on the Additional Data B screen.
    After that:
    1. Using the sales order material number (VBAP-MATNR), sales organization (VBAK-VKORG), distribution channel (VBAK-VTWEG), division (VBAK-SPART), and sold-to number (VBPA-KUNNR for VBPA-PARVW = 'AG'), access the Billing Items by Material Index table (VRPMA) and specify a billing date (VRPMA-FKDAT) of less than 60 days before the current sales order's requested delivery date (if specified at header VBAK-VDATU, or at the schedule line level VBEP-EDATU). This will result in all the billing documents where the sold-to bought the item, but it isn't completely refined as of yet. Retain the billing document (VRPMA-VBELN), item (VRPMA-POSNR), and billing date (VRPMA-FKDAT) in a temporary table to pass to step 2 as input.
    2. Use the billing document (VRPMA-VBELN) and item (VRPMA-POSNR) to read the Sales Document Partners table (VBPA) for the ship-to partner function (VBPA-PARVW = 'WE') and the sales order ship-to (VBPA-KUNNR), to select ONLY billing documents that are for that given ship-to location. This filters out only the billing documents relevant for that ship-to location.
    3. From the resulting list of billing documents, select the most recent date (VRPMA-FKDAT), which will refine the search to the last billing document (VRPMA-VBELN) and item (VRPMA-POSNR).
    4. Using the most recent billing document (VBPA-VBELN), access the Billing Document Item table (VBRP) to obtain the Last Extended Price as VBRP-KZWI1.
    5. This price is an extended price which needs to be converted to a 'unit price'. For this billing item, check the sales unit (VBRP-VRKME) to determine whether the sales unit is cases or eaches.
    a. If the unit of measure is cases, simple math is required: divide the Last Extended Price (VBRP-KZWI1) by the billing quantity (VBRP-FKIMG). Standard rounding should apply (.005 rounds up to .01).
    How can this be achieved? (A rough SQL sketch of the lookup appears after the reply below.)

    Hi Chakravarthy,
    use the exits provided in SAPMV45A - the MV45*ZZ includes, and the screen exits as well (8309, 8310, 8459, 8460). Just be sure to
    put your code in the ZZnnnnnn includes within the SAP-provided forms instead of coding directly in the forms.
    You can check below user exits:
    MV45ATZZ :For entering metadata for sales document processing. User-specific metadata must start with "ZZ".
    MV45AOZZ:
    For entering additional installation-specific modules for sales document processing which are called up by the screen and run under PBO (Process Before Output) prior to output of the screen. The modules must start with "ZZ".
    MV45AIZZ:
    For entering additional installation-specific modules for sales document processing. These are called up by the screen and run under PAI (Process After Input) after data input (for example, data validation). The modules must start with "ZZ".
    MV45AFZZ and MV45EFZ1:
    For entering installation-specific FORM routines and for using user exits, which may be required and can be used if necessary. These program components are called up by the modules in MV45AOZZ or MV45AIZZ.
    Reddy
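    To make steps 1-4 of the requirement concrete, here is a rough plain-SQL sketch of the lookup sequence (in the user exit this would be ABAP Open SQL; the join conditions, the host variables such as :matnr and :req_date, and the exact placement of the organizational columns on VRPMA are assumptions):
    SELECT b.kzwi1, b.fkimg, b.vrkme        -- extended price, billed qty, sales unit (step 5)
    FROM   vrpma i                          -- billing items by material index (step 1)
           JOIN vbpa p ON p.vbeln = i.vbeln AND p.posnr = i.posnr            -- partners (step 2)
           JOIN vbrp b ON b.vbeln = i.vbeln AND b.posnr = i.posnr            -- billing item (step 4)
    WHERE  i.matnr = :matnr
    AND    i.vkorg = :vkorg AND i.vtweg = :vtweg AND i.spart = :spart
    AND    i.kunnr = :kunnr_ag              -- sold-to from the order
    AND    i.fkdat >= :req_date - 60        -- billed within the last 60 days
    AND    p.parvw = 'WE'                   -- ship-to partner function
    AND    p.kunnr = :kunnr_we              -- same ship-to as the order
    ORDER BY i.fkdat DESC                   -- most recent first (step 3)
    FETCH FIRST 1 ROW ONLY;
    -- Step 5a: unit price = kzwi1 / fkimg when vrkme indicates cases.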

  • Saving big data efficiently for processing?

    Every 15 minutes we read 250 XML files. Each XML file is one element, and each element is composed of 5 sub-elements. Each sub-element has 400 counters, so every XML file has 2,000 counters. Since there are 250 XML files, that is a total of 500K counters every cycle.
    Data can look like this. This is one XML file, there are 249 more like this:
    ELEM1
    - ELEM1_1
    - Counter1: 54
    - Counter2: 12
    - Counter3: 6
    - Counter400: 9
    - ELEM1_2
    - Counter1: 43
    - Counter2: 65
    - Counter3: 98
    - Counter400: 12
    - ELEM1_3
    - Counter1: 43
    - Counter2: 23
    - Counter3: 64
    - Counter400: 1
    - ELEM1_4
    - Counter1: 4
    - Counter2: 2
    - Counter3: 8
    - Counter400: 12
    - ELEM1_5
    - Counter1: 43
    - Counter2: 98
    - Counter3: 2
    - Counter400: 12
    The first, most common thought was to create a table with the columns being the counter names. But this was done before with similar data, and performance was sub-par, to say the least.
    So my question is: what would be the best way to store all these counters in a database?
    Thanks.
    VM

    This is the classic design conundrum when working with XML:
    Where should the XML shredding be performed - in the application or in the database?
    Should I store the data in native XML format or shred it out into relational form?
    The database engine is an RDBMS and is designed for working with relational data. This should give you a steer as to how you might wish to store your data.
    If you need to query the contents of the XML fragments, then you're probably better off shredding them out into relational data structures, but it really depends on your specific use case (see the sketch below).
    There are optimizations that can improve the performance of querying XML data types via various indexes, but accessing the same information via a relational structure is almost always faster than XML processing, XQuery, XPath, etc.
    If you just want a document storage system for XML, then an RDBMS may not be the most suitable technology to use.
    Can you expand on your comment, "performance was sub-par"? Specifically, what was not performing as required?
    John Sansom | SQL Server MCM
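    As one illustration of the "shred into relational form" advice, here is a minimal sketch of a narrow fact table - one row per counter reading instead of 2,000 columns. The table and column names are assumptions, not from the thread:
    -- Hypothetical narrow layout: one row per counter reading.
    CREATE TABLE counter_reading (
      read_time    TIMESTAMP    NOT NULL,  -- 15-minute collection timestamp
      element      VARCHAR(30)  NOT NULL,  -- e.g. 'ELEM1'
      sub_element  VARCHAR(30)  NOT NULL,  -- e.g. 'ELEM1_2'
      counter_id   INT          NOT NULL,  -- 1..400
      val          NUMERIC      NOT NULL
    );
    -- Index to support typical "one counter over time" queries.
    CREATE INDEX counter_reading_ix
      ON counter_reading (element, sub_element, counter_id, read_time);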

  • Startup guide for oracle data mining for anomaly detection

    hi
    Well, I have set up Oracle 10g for data mining. I have also downloaded and installed the demo programs.
    Now I'm wondering how to start developing my own model... Basically, my idea is to use an anomaly detection technique on network traffic.
    I want to scan network packets and mine them for anomalies. Do I have to create profiles for that, and if yes, how?
    A big dilemma... if anyone can guide me, I'll appreciate it.
    Cheers
    ninja

    Ninja,
    You may also want to take a look at this thread in the forum:
    Re: Some Questions regarding Apriori algorithm and anomaly detection
    It has some discussion that might help.
    -Marcos

  • I'm finding iOS 7 exceedingly frustrating, having to spend hours reinstalling and looking for lost data. For example, I cannot delete photos I have synced, and applications are missing. Where can I get clear instructions to fix the mess?

    Where can I get clear instructions to clean up the mess that iOS 7 has made of my data and info on my iPad? I can't delete photos, apps are missing; it's crazy and so frustrating. I have spent hours on this.

    How did the photos get onto your iPad? If they were synced from a computer, then they can't be deleted directly in the iPad's Photos app - instead, they are deleted by not including them in your next photo sync from your computer's iTunes.
    For your apps, as long as they are still in your country's store, you should be able to redownload them via the Purchased tab in the App Store app on your iPad, and via the Purchased link under Quick Links on the right-hand side of the iTunes Store home page in your computer's iTunes. If you backed up to your computer's iTunes before starting the update, you could redownload the apps to iTunes and then retry restoring from your backup to see if that copies the apps and their content back. (The actual apps aren't included in a backup, just their content/settings, so for a restore to work fully you need to have the relevant apps in the Apps part of your library.)

  • Fix for excessive data usage for "mapping services"?

    I'm experiencing a problem where my iPhone 4S running iOS 7.1.2 (though I saw this problem with 7.1.1 as well) uses huge amounts of cellular data (500 MB+) while sitting idle. Looking at usage in the Cellular section of settings reveals that the data is being used for "mapping services" under system services. I've turned off location services for all but a very few apps that truly require it. I've also turned off background refresh for all apps, cellular data for most apps and system services, as well as turning off the frequent locations feature. Unfortunately, even after making these changes, the problem occurs. Some searching does reveal a few other people reporting this (e.g. here). Unfortunately, I haven't found any reports of working solutions.
    I'm traveling in Africa, and using relatively expensive prepaid data, so this problem is actually costing me real money, having used up 2 GB (~$60). If anyone has seen this and discovered a real fix, I would love to hear about it!

    I had the same problem. Mapping Services used 1 GB of data overnight while I was sleeping. My location services are turned off, along with anything else that could use mapping services. I reported the problem to Vodacom (a network service provider in South Africa), and they told me they had numerous complaints about this and are going to refund my money, BUT they cannot tell me how to solve the problem going forward. Vodacom advised that Apple is migrating from Google Maps to its own Maps application, and that is why the data is being depleted in the background. This makes no sense to me, though. How can they consume customers' data to upgrade their systems? Very frustrating.
    My next step is to go into the Apple Store and see if they can do anything. If I happen to find a solution from Apple, I will be sure to let you know!

  • Maintain number range interval for master data upload for existing employee

    Hi  Experts,
    1) I have a scenario to upload additional data for existing employees in PA. The employees already exist in SAP HR, but additional infotypes need to be maintained for those employees.
    2) I have a scenario where I have to upload master data for new employees.
    Please give a detailed description of how to maintain the number range interval (external or internal) for the upload in both of the above scenarios. Do we have to maintain the number range manually in the master data record and then upload it through BDC or LSMW?
    << Moderator message - Everyone's problem is important. But the answers in the forum are provided by volunteers. Please do not ask for help quickly. >>
    Edited by: Rob Burbank on Jan 12, 2011 3:49 PM

    >
    s c patil wrote:
    > 2) For new employees, do I have to maintain the desired (mine or the client's?) number range in the SAP system as an external number range, default that number range in NUMKR, maintain those numbers in the master data records, get the data template filled by the client, and upload the data; and after that create a new number range, next to the existing external number range, as an internal number range, and default that internal number range?
    >
    > Pls reply ASAP
    Yes Mr. Patil...
    For existing employees
    you need to execute the hiring action through BDC with an external number range. While recording, you have to use at least three infotypes, i.e. IT0000, IT0001, IT0002. In addition, you can upload other infotypes through PA30.
    For new employees
    During configuration you can create another number range as internal for new hires, and use the NUMKR feature as well.
    Here I don't understand why you are looking for an upload process for new hires if it is not a mass hiring. That should be a day-to-day activity done by the user through PA40.
    Best Regards,
    Anand Singh

  • Are there requirements for deterministic data messaging for distributed LabVIEW applications across an Ethernet/IP network?

    Our company, RTI, recently joined the Alliance program at NI. We are curious whether power users of LabVIEW have requirements for running multiple distributed, networked LabVIEW applications in a real-time, deterministic manner. We are not attempting to make this a sales call, but we want to know whether we should move forward with the development of VIs that would support distributed LabVIEW applications. We have already scoped out the effort, but would like guidance on VI layout and descriptions, and on ways to give a "LabVIEW" experience rather than a disjointed VI that looks unprofessional. We are not sure demand exists for this requirement. We would like to talk with LabVIEW users
    and investigate what they want. We have attached a brochure on this real-time data messaging technology for your review. Maybe there is no need, in which case we apologize for taking up NI Developer Zone bandwidth and your valuable time. We look forward to your responses.
    Attachments:
    NDDSProductBrief3-022.pdf ‏299 KB

    We are developing applications which have a distributed layout. E.g., we have 2 or more test stations which have no user interface and 1 station where all configuration, monitoring, and data storage take place. We use TCP/IP for the inter-computer communication. For this purpose we have developed a library which has a similar interface and usage to the queue VIs from LV 6.0.2. We support automatic reconnect at the TCP/IP level and an acknowledged transfer, so the application programmer can choose to do some temporary storage on the local machine if the server part is not running for some time. We have not seen any market for it, so we do not plan to sell it.
    In a lecture at the NI VIP days in Munich in 2001, personnel from Siemens A&D stated that to get real-time-like behaviour over Ethernet with TCP/IP, you have to take care that the load does not exceed 30% of the bandwidth.
    For more information you can contact me at [email protected]
    Waldemar
    Using 7.1.1, 8.5.1, 8.6.1, 2009 on XP and RT

  • Develop an ALV report for daily cash receipts for a selected date range

    Hi,
    How do I develop an ALV report for daily cash receipts for a selected date range? What tables and fields do we have to use for this report? What should the selection screen look like, and what is the logic? Please give me a sample report.

