Oracle Big Data Lite VM

Hi There,
I have been trying to download the Oracle Big Data Lite VM 4 from the following link:
Oracle Big Data Lite Virtual Machine
The issue is that after I download the zip files and try to extract them using 7-Zip, it gives me an "unspecified error" and is unable to extract. The size of the file is 2 GB. I tried from two laptops and both had the same issue. I have already downloaded the file three times with no success.
Any suggestions for the same?
Thx
Vivek

Got it fixed. I had to rename each file to .zip.001 rather than .7z.001, and then extract the .001 file alone in the same folder.
Got the information from:
virtual machine - Oracle Big Data Lite 2.5 VM Install - Stack Overflow
Thx
Vivek

Similar Messages

  • Big Data Lite ver 3.0 cannot launch movieplex demo

    I downloaded the Big Data Lite ver 3.0 virtual machine, successfully imported the .ova, and logged in as oracle to the Linux KDE desktop.
    In Firefox I clicked "Start Here", then clicked "http://localhost:7001/movieplex/index.jsp". Immediately I got the Firefox error:
    Firefox can't establish a connection to the server at localhost:7001.
    I googled and not much information was found. I am not sure if there is a Java error in setDomainEnv.sh, as some posts indicated. I followed them and modified the file, but still couldn't get the website up. I am not into the lab exercises yet; I was just attempting to run the demo and already hit an error.
    Can anyone help please?

    Did anyone get this working for Big Data Lite VM?
    I was getting the Jackson NoSuchMethod... error, but I was able (I think) to get around it by downloading newer versions of the Jackson jars.
    After I click Sign In, all I get is the text: oracle.kv.impl.api.KVStoreImpl@<a_value_which_changes_every_click>
    Should I just give up and load 2.5? Is it more stable?
    How deep do the errors go? This is the third thing I have had to fix, and I am wasting valuable time spinning my wheels.
    Thanks,
    matt

  • Best practices for administering Oracle Big Data Appliance

    - Best practices as part of administration of the Oracle Big Data infrastructure
    - How do we lock down maximum space usage per project? E.g., project team A can have a max limit of 10 TB of allocated space
    - Restricting roles and access (read, write); a placeholder for common shared artifacts
    - A template/procedure for code migration across dev, QA, and prod environments, etc.

    Your data is bigger than what I run, but what I have done in the past is restrict each team's accounts to a separate datafile and limit its size to the maximum I want them to use, then create their objects restricted to that location.
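    A minimal sketch of that approach in SQL (all names and sizes below are hypothetical; adjust to your environment):
    -- Give the project team its own tablespace with a hard size ceiling.
    -- BIGFILE lets a single datafile grow far beyond smallfile limits.
    CREATE BIGFILE TABLESPACE team_a_ts
      DATAFILE '/u02/oradata/orcl/team_a_ts.dbf' SIZE 100G
      AUTOEXTEND ON MAXSIZE 10T;  -- the 10 TB cap for project team A
    -- Point the team's account at it and block growth anywhere else.
    ALTER USER team_a_user
      DEFAULT TABLESPACE team_a_ts
      QUOTA UNLIMITED ON team_a_ts
      QUOTA 0 ON users;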

  • Scripts for OTN Developer Day - Big Data

    Hi,
    I was wondering if anybody knows where to find the scripts used for "OTN Developer Day - Big Data" which was held during February/March 2014?
    The "Lab Guide" can be found in: https://www.oracle.com/webfolder/s/delivery_production/docs/FY14h1/BigDataWorkshop.pdf
    Regards,
    Babak.

    Not sure if these are the exact same ones, but they should be very close: Oracle Big Data Lite Virtual Machine
    It will also get you to the page where the VM is frequently updated, and the HOL sections come with the updated VM.
    JP

  • OEID Big Data Connector Integration

    The Oracle big data solution shows that OEID can source from the Oracle Big Data Connectors to ingest big data for exploration and discovery.
    Is there any documentation on how to achieve this?
    Can anyone share any information ?

    It depends on the connector. Oracle Loader for Hadoop runs on the Hadoop cluster and does not reside on the Oracle database; it can run on any node from which you submit your MapReduce jobs, which might or might not be a node of the cluster. Oracle Direct Connector for HDFS resides on the Oracle database.
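    As a rough illustration of the external-table mechanism the HDFS connector is built on (the table, directory object, preprocessor path, and location file below are assumptions, not a generated definition):
    -- Hypothetical external table over files in HDFS; the preprocessor
    -- streams the HDFS content to the database at query time.
    CREATE TABLE sales_hdfs_ext (
      sale_id NUMBER,
      amount  NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        PREPROCESSOR osch_bin_path:'hdfs_stream'  -- connector-supplied script
        FIELDS TERMINATED BY ','
      )
      LOCATION ('sales_hdfs.loc')  -- location file naming the HDFS paths
    )
    REJECT LIMIT UNLIMITED;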

  • Connection with Hadoop/big data

    Hi,
    How can we connect Hadoop with Endeca Studio? If there is any document, please let me know.
    thanks and regards
    Shashank Nikam

    Hi,
    As far as I know, you will need to use Oracle Data Integrator (ODI) or the Oracle Big Data Connectors.
    Some sites:
    Oracle Data Integrator Enterprise Edition 12c | Data Integration | Oracle
    http://docs.oracle.com/cd/E37231_01/doc.20/e36963/concepts.htm#BIGUG107
    http://www.oracle.com/us/products/database/big-data-connectors/overview/index.html

  • Oracle Business Intelligence with big data

    Has anyone implemented OBIEE 10g utilizing a denormalized data model for a very large transactional data set? There is a potential to generate reports in the 100s of millions of rows, with the data warehouse storing fact tables that have ~1.5 billion rows with a data consumption rate of over 200GB per month.
    Does anyone have any best practices, tips, or advice to determine the feasibility of implementing OBIEE 10g for such a large volume? The data is transactional and there are no current requirements for aggregate data sets. Is it feasible to use OBIEE Answers to generate these types of reports? Thus far I've seen OBIEE Answers hanging/crashing on reports that are > 10MB in size. Any configuration tips or feedback would be appreciated.
    Thanks,
    John

    I think that with a big data environment you need not worry about caching, runtime aggregation, and processing if your configuration is right. The hardware, along with distributed processing, will take care of most of the load, since big data databases are designed to be in-memory and highly responsive.
    The thing you should consider is that the final output should not be too large. What I mean is, your request can process 100 million rows on the database, but the final output can be 1,000 rows, since BI is all about summarization.
    If large data is thrown at the Presentation Server, it can make the presentation service unstable.
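    To make that concrete, here is an illustrative query (table and column names are made up): the database does the heavy lifting, and only the summary reaches BI.
    SELECT region, SUM(sales_amount) AS total_sales
    FROM   fact_sales   -- ~1.5 billion rows scanned in the database
    GROUP  BY region;   -- only a few summary rows reach the Presentation Server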

  • Best strategy to upload a big data file and then to insert in db the content

    Hi,
    Here's our requirement. We have a web business application developed on JSF 1.2 and Java EE 6, with WebLogic as the application server and Oracle as the back-end data tier. We need to upload big data files (80 to 100 MB) from a web page and persist the content in database tables.
    What's the best way, in terms of performance, to implement this use case? Once the file is uploaded to the server, a command button on the web page triggers a JSF controller action to save the data in the database.
    Currently we plan to keep the content of the HTTP request in memory and call an insert for each line of the file. But I think that is bad and not scalable.
    Is it better to write the file to the server's disk and then use multiple threads to send the lines to the database? How would one use multithreading in a JSF managed bean?
    Thanks

    In addition, DBMS_LOB.LOADFROMFILE is overloaded to handle both BLOB and CLOB destinations:
    PROCEDURE LOADFROMFILE
    Argument Name                  Type                    In/Out Default?
    ------------------------------ ----------------------- ------ --------
    DEST_LOB                       BLOB                    IN/OUT
    SRC_LOB                        BINARY FILE LOB         IN
    AMOUNT                         NUMBER(38)              IN
    DEST_OFFSET                    NUMBER(38)              IN     DEFAULT
    SRC_OFFSET                     NUMBER(38)              IN     DEFAULT

    PROCEDURE LOADFROMFILE
    Argument Name                  Type                    In/Out Default?
    ------------------------------ ----------------------- ------ --------
    DEST_LOB                       CLOB                    IN/OUT
    SRC_LOB                        BINARY FILE LOB         IN
    AMOUNT                         NUMBER(38)              IN
    DEST_OFFSET                    NUMBER(38)              IN     DEFAULT
    SRC_OFFSET                     NUMBER(38)              IN     DEFAULT
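    A minimal usage sketch in PL/SQL (the directory object, table, and file names are hypothetical): write the uploaded file to server disk, expose it as a BFILE, and let the database copy it into the LOB in one call.
    DECLARE
      l_bfile BFILE := BFILENAME('UPLOAD_DIR', 'big_upload.csv');
      l_blob  BLOB;
    BEGIN
      -- Create the row with an empty LOB locator and grab the locator.
      INSERT INTO uploads (id, payload)
      VALUES (1, EMPTY_BLOB())
      RETURNING payload INTO l_blob;
      -- Stream the whole server-side file into the BLOB column.
      DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
      DBMS_LOB.LOADFROMFILE(dest_lob => l_blob,
                            src_lob  => l_bfile,
                            amount   => DBMS_LOB.GETLENGTH(l_bfile));
      DBMS_LOB.CLOSE(l_bfile);
      COMMIT;
    END;
    /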

  • What is the best big data solution for interactive queries of rows with up?

    We have a simple table such as follows:
    | Name | Attribute1 | Attribute2 | Attribute3 | ... | Attribute200 |
    | Name1 | Value1 | Value2 | null | ... | Value3 |
    | Name2 | null | Value4 | null | ... | Value5 |
    | Name3 | Value6 | null | Value7 | ... | null |
    | ... |
    But there could be up to hundreds of millions of rows/names. The data will be populated every hour or so.
    The goal is to get results for interactive queries on the data within a couple of seconds.
    Most queries look like:
    select count(*) from table
    where Attribute1 = Value1 and Attribute3 = Value3 and Attribute113 = Value113;
    The where clause contains arbitrary number of attribute name-value pairs.
    I'm new to big data and wondering what the best option is in terms of data store (MySQL, HBase, Cassandra, etc.) and processing engine (Hadoop, Drill, Storm, etc.) for interactive queries like the above.

    Hi,
    As always, the correct answer is "it depends".
    - Will there be more reads (queries) or writes (INSERTs)?
    - Will there be any UPDATEs?
    - Does the use case require any of the ACID guarantees, or would "eventual consistency" be fine?
    At first glance, Hadoop (HDFS + MapReduce) doesn't look like a viable option, since you require "interactive queries". Also, if you require any level of ACID guarantees or UPDATE capabilities, the best (and arguably only) solution is an RDBMS. Also, keep in mind that millions of rows are pocket change for modern RDBMSs on average hardware.
    On the other hand, if there will be a lot more queries than inserts, very few or no updates at all, and eventual consistency will not be a problem, I'd probably recommend testing a key-value store (such as Oracle NoSQL Database). The idea would be to use (AttributeX, ValueY) as the key, with a sorted list of the Names that have ValueY for their AttributeX as the value. This way you do only as many reads as there are attributes in the WHERE clause, and then compute the intersection (very easy and fast with sorted lists).
    Also, I'd do this computation manually. SQL may be comfortable, but I don't think it's Big Data ready yet (unless you chose the RDBMS way, of course).
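    The same inverted-index idea can be sketched in plain SQL, just to make the access pattern concrete (table and names are made up; in a key-value store each (attribute, value) pair would be a key instead):
    -- One row per (attribute, value, name): the inverted index.
    -- Assumes each (attr, val, name) combination appears only once.
    CREATE TABLE attr_index (
      attr VARCHAR2(30),
      val  VARCHAR2(100),
      name VARCHAR2(100)
    );
    -- Intersection of the predicate lists: a name qualifies only if it
    -- matches every (attribute, value) pair in the query.
    SELECT COUNT(*)
    FROM (SELECT name
          FROM   attr_index
          WHERE  (attr, val) IN (('Attribute1',   'Value1'),
                                 ('Attribute3',   'Value3'),
                                 ('Attribute113', 'Value113'))
          GROUP  BY name
          HAVING COUNT(*) = 3);  -- 3 = number of pairs in the IN list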
    I hope it helped,
    Joan
    Edited by: JPuig on Apr 23, 2013 1:45 AM

  • Big Data

    Really, this new technology could not have another name. In Portuguese, in a literal translation, we can say "big data", or even the analytic function of a large volume of data, structured or unstructured, made up of sound, images, numbers, and even personalities. This analytic function, which is a statistical function, can determine trends for a given sequence of actions on the internet. For example, in my case, over a certain period of time I created a sequence of groups with the same configuration and the same format of use. Therefore, the analysis may indicate a tendency to create new groups with the same goal. Soon after, the analysis indicates the creation of a page as an author and a daily event. Thus, including the analysis of the texts and of the interactions, tastes, and engagement with the texts, we can come to a conclusion: this guy writes every day.
    Well, so far we see that whoever wants to be effective in the use of this new technology, an analysis tool, should have basic knowledge of statistics, basic math.
    So here comes the question: will professionals in the exact sciences be ahead in this technology field? Or does the human side, referring to tastes and engagement, also involve professionals in the humanities?
    I understand that without the two views, the analysis will be distorted.
    With the two views, we can say that in my case the analysis may indicate that social groups are actually short stories written on the social network, also called "social books".
    Recommendation: exact-sciences people, learn to listen to human and personal feelings; humanities people, learn statistics.
    That way you can collaborate and work together as a team.

    975791 wrote:
    Hi All,
    I have around few years experience ( L2  level) in Oracle DBA.I recently joined in a company.
    You realize, of course, that "L2 level" is not an industry-wide term and means nothing outside of your own organization's usage.
    When I first started in this business (1981), my company had 3 "levels" of "programmer-analyst": PA-1, PA-2, and PA-3. The PA-1 was the most junior and the PA-3 was the most senior. In my next job they had exactly the same job titles, but there PA-1 was the most senior and PA-3 the most junior.
    "When I use a word," Humpty Dumpty said in rather a scornful tone, "it means just what I choose it to mean -- neither more nor less."
    (Lewis Carroll - Through the Looking Glass)
    They asked me to study big data. Could you please share about the future of big data in the IT industry?
    You do realize, of course, that "Big Data" is just a concept, little more than a marketing term. It does not necessarily refer to anything that is in conflict with your Oracle experience.
    I hesitate to learn big data because I spent a lot of time/energy on Oracle DBA work.
    Why do you think learning MORE will negate what you've already learned? The more you know about anything, the more you will know about everything, and the more valuable you will be. There's no such thing as wasted education.
    Share about the Oracle DBA future, because Oracle automated everything in the database.
    No, they have not "automated everything in the database". They have not automated intelligence into design. They have not automated intelligence in problem solving. Sure, the database doesn't require as much day-to-day hand-holding as it used to, but that just frees you up for more intelligent work.
    Please anyone advise me.
    Don't expect your career 30 years from now to be the same as it is today.
    Don't expect technology 30 years from now to be the same as it is today.

  • Big data and database administration

    Hi,
    I am working as an Oracle DBA. I would like to know what the DBA role is for big data and NoSQL.
    Is it really useful to learn big data?
    Thanks,

    Is there any relationship between these two fields?
    You are comparing cheese with chalk.
    How can I learn more about data warehousing?
    Start with the Oracle doc:
    Oracle® Database Data Warehousing Guide
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14223/toc.htm

  • BIG Data with OBIEE

    What is big data? How do we handle it in OBIEE?
    Please give me the steps to implement it.
    Thanks.

    Hi,
    Not sure what you want to achieve here... however, what I read is that you would like a way to turn unstructured data into structured data so you can report on it using OBIEE. (Correct?)
    Depending on the amount of data and the things you need to do to it before you can structure it, you can create a map/reduce function and use Hadoop. There is a connector to push the Hadoop results into the Oracle database. Then you could use OBIEE to report on the now-structured data in the Oracle database.
    As I do not know the details of what you would like to achieve, this is the solution I would start looking into if I were you.
    mark as helpful if answered

  • What is Big Data all about?

    Hi all,
    May I know what big data is all about? How can I install it and learn it?
    Is it in good demand in the market, and is there good pay for it? I am from India; currently I am working in Oracle Incentive Compensation, and my previous experience is in Siebel CRM.
    Can you please suggest good things?
    thanks

    Well, there are many big data initiatives and there is a lot of interest in the market. So, yes big data is an interesting career path.
    What is big data? Many things, but in general the "art" of collecting, correlating and understanding data - in massive volumes and at rapid ingest speeds - to drive businesses to better support their customers, their goals etc.
    How to get started? Look around the forums, but also on LinkedIn, where you can see job postings. Two avenues to consider for learning:
    - Analytics, look at this Oracle link: http://www.oracle.com/technetwork/database/options/advanced-analytics/index.html
    - Hadoop and NoSQL database, look at the big data appliance page, and possibly take a class on Hadoop
    Hope that helps,
    JP

  • Certifications for "Appliances" & more Big Data certifications Please

    It seems Oracle is committing to appliances . . . database appliance, big data appliance, Exadata, and the Exalytics In-Memory Machine.
    Will there be a shift in certification to match?
    For example, there is an appliance certification out there already . . . the Exadata certification . . . so, extending this idea, there could (or should) be certifications related to how to support the other appliances.
    I would like to see a certification related to big data . . . how about a NoSQL certification?

    910086 wrote:
    It seems Oracle is committing to appliances . . . database appliance, big data appliance, Exadata, and the Exalytics In-Memory Machine.
    Will there be a shift in certification to match?
    For example, there is an appliance certification out there already . . . the Exadata certification . . . so, extending this idea, there could (or should) be certifications related to how to support the other appliances.
    I would like to see a certification related to big data . . . how about a NoSQL certification?
    On the certification website there is an Oracle Certified Database Cloud Administrator track published (http://education.oracle.com/pls/web_prod-plq-dad/db_pages.getpage?page_id=458&get_params=p_track_id:CLOUDOCP) .... (some relevant exams are in beta (Exadata) or just about to go beta) ... There is an Oracle Cloud Application Foundation Certified Implementation Specialist ... the 12c DBA OCP database upgrade exam (from 11g DBA OCP) is present but not bookable on Pearson VUE (last time I looked) ... waiting for the 12c database release (should be soon).
    There are some matching things on the systems side too: http://education.oracle.com/pls/web_prod-plq-dad/db_pages.getpage?page_id=632 (see under systems).
    If I go to the 'big data' technology home page on OTN (http://www.oracle.com/us/technologies/big-data/index.html) ... and quite frankly it's the first time I've been there, as far as I remember ... I see a lot of technologies brought together ... some are (relatively) old, some seem relatively new. Some have associated certifications already ... some do not ... and I suspect some never will (not everything has to have a certification).
    When a technology is in its infancy, the developers, champions, and first users will be getting to grips with it, learning it, developing it, and learning how to apply it. When this has matured enough, the expertise will also have developed enough to take stock and produce a certification.
    I am also aware that Oracle is aware of the ability of a certification to incentivise candidates (and Oracle partners) to get onto evaluating or adopting a particular technology or new features (overall this probably benefits both sides ... but it can mean a certification may be slightly skewed towards new features or aspects Oracle wishes to promote ... especially for OCS).

  • Is there any connectors between - OBIEE RPD & Big Data

    Is there any connector between the OBIEE RPD and big data? How will we get structured and unstructured data into the OBIEE RPD to generate reports?

    Not sure what you want to achieve here... however, what I read is that you would like a way to turn unstructured data into structured data so you can report on it using OBIEE. (Correct?)
    Depending on the amount of data and the things you need to do to it before you can structure it, you can create a map/reduce function and use Hadoop. There is a connector to push the Hadoop results into the Oracle database. Then you could use OBIEE to report on the now-structured data in the Oracle database.
    As I do not know the details of what you would like to achieve, this is the solution I would start looking into if I were you. :-)
    Regards,
    Johan Louwers
