Big Data

Really, this new technology could not have another name. Even in a literal Portuguese translation we would still say "big data": the analysis of a large volume of data, structured or unstructured, made up of sounds, images, numbers, and even people's personalities. This analytic function, which is a statistical function, can identify trends in a given sequence of actions on the internet. For example, in my case, over a certain period of time I created a sequence of groups with the same configuration and the same format of use. The analysis may therefore indicate a tendency to create new groups with the same goal. Soon after, it may indicate the creation of a page as an author, and a daily event. Thus, adding in the analysis of the texts and of the likes and engagement they receive, we can come to a conclusion: this guy writes every day.
Well, so far we can see that whoever wants to be effective with this new technology, an analysis tool, should have a basic knowledge of statistics; that is, math.
So here comes the question: will professionals in the exact sciences be ahead in this technology field? Or does the human side of the question, the tastes and engagement, also involve professionals in the humanities?
I understand that without both views, the analysis will be distorted.
With both views, we can say that in my case the analysis may indicate that the social groups are actually short stories written on the social network, also called "social books".
Recommendation: people in the exact sciences, learn to listen to human and personal feelings; people in the humanities, learn some statistics.
That way you can collaborate and work together as a team.

975791 wrote:
Hi All,
I have a few years of experience (L2 level) as an Oracle DBA. I recently joined a new company.
You realize, of course, that "L2 level" is not an industry-wide term and means nothing outside of your own organization's usage.
When I first started in this business (1981) my company had 3 "levels" of "programmer-analyst" -- PA-1, PA-2, and PA-3.  The PA-1 was the most junior and the PA-3 was the most senior.  In my next job they had exactly the same job titles, but there PA-1 was the most senior and PA-3 the most junior.
"When I use a word," Humpty Dumpty said in rather a scornful tone, "it means just what I choose it to mean -- neither more nor less."
(Lewis Carroll - Through the Looking Glass)
They asked me to study Big Data. Could you please share your view of Big Data's future in the IT industry?
You do realize, of course, that "Big Data" is just a concept, little more than a marketing term.  It does not necessarily refer to anything that is in conflict with your Oracle experience.
I hesitate to learn Big Data because I have spent a lot of time/energy on Oracle DBA work.
Why do you think learning MORE will negate what you've already learned?  The more you know about anything, the more you will know about everything, and the more valuable you will be.  There's no such thing as wasted education.
Share your view of the Oracle DBA future, because Oracle has automated everything in the database.
No, they have not "automated everything in the database".  They have not automated intelligence into design.  They have not automated intelligence into problem solving.  Sure, the database doesn't require as much day-to-day hand-holding as it used to, but that just frees you up for more intelligent work.
Please anyone advise me.
Don't expect your career 30 years from now to be the same as it is today.
Don't expect technology 30 years from now to be the same as it is today.

Similar Messages

  • I am using the big date calendar template and when I submit it to Apple for printing I lose the names of two months. These names are not text boxes; I see the names when I send the calendar in, but something happens during the transmission to Apple.

    I am using the big date calendar template in iPhoto. I am on Lion 10.7.2, on a MacBook Air. The names of the months are on each calendar page, but something happens when I send the data to Apple. The names are part of the template; they are not text boxes. I lose two names on the calendar after it is sent to Apple. Apple suggested I make a PDF file of my calendar before sending it in and check to make sure every name shows. I did this with a calendar I just sent in. The calendar was correct; all names of the months were showing. After sending the data, two month names disappeared: when the calendar arrived by mail, it was incorrect. Apple looked at my calendar via a PDF file and it was incorrect.  This is the second time this has happened. I called Apple and they had me delete several folders in the Library folder and some preferences, and do a complete reinstall of iPhoto.  I have not yet remade the defective calendar. I am wondering if anyone else has had this problem?
    kathy

    Control-click on the background of the view all pages window and select "Preview Calendar" from the contextual menu.
    You can also save the pdf as a file to compare to the printed calendar.  If the two names are visible in the pdf file then the printed copy should show them; if the print is still missing them, contact Apple for a refund.  Apple Print Products - Apple Store (U.S.)

  • Best strategy to upload a big data file and then to insert in db the content

    Hi,
    Here's our requirement. We have a web business application developed on JSF 1.2 and Java EE 6, with WebLogic as the application server and Oracle for the back-end data tier. We need to upload big data files (80 to 100 MB) from a web page and persist the content in database tables.
    What's the best way to implement this use case in terms of performance? Once the file is uploaded to the server, a command button is available on the web page to trigger a JSF controller action that saves the data in the database.
    Currently we plan to keep the content of the HTTP request in memory and call an insert for each line of the file, but I think that is bad and not scalable.
    Is it better to write the file to the server's disk and then use multiple threads to send the lines to the database? How would one use multithreading in a JSF managed bean?
    Thanks

    In addition, LoadFromFile is overloaded to handle both BLOB and CLOB:
    PROCEDURE LOADFROMFILE
    Argument Name                  Type                    In/Out Default?
    DEST_LOB                       BLOB                    IN/OUT
    SRC_LOB                        BINARY FILE LOB         IN
    AMOUNT                         NUMBER(38)              IN
    DEST_OFFSET                    NUMBER(38)              IN     DEFAULT
    SRC_OFFSET                     NUMBER(38)              IN     DEFAULT
    PROCEDURE LOADFROMFILE
    Argument Name                  Type                    In/Out Default?
    DEST_LOB                       CLOB                    IN/OUT
    SRC_LOB                        BINARY FILE LOB         IN
    AMOUNT                         NUMBER(38)              IN
    DEST_OFFSET                    NUMBER(38)              IN     DEFAULT
    SRC_OFFSET                     NUMBER(38)              IN     DEFAULT
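    For illustration, here is a minimal PL/SQL sketch of loading a server-side file into a BLOB column with DBMS_LOB.LOADFROMFILE. The directory object, table, and file names below are hypothetical, and error handling is omitted:
    -- assumes:  CREATE DIRECTORY upload_dir AS '/u01/uploads';
    --           CREATE TABLE uploads (id NUMBER PRIMARY KEY, payload BLOB);
    DECLARE
      l_bfile  BFILE := BFILENAME('UPLOAD_DIR', 'bigfile.dat');
      l_blob   BLOB;
    BEGIN
      -- insert the row first so there is a LOB locator to load into
      INSERT INTO uploads (id, payload) VALUES (1, EMPTY_BLOB())
      RETURNING payload INTO l_blob;
      DBMS_LOB.FILEOPEN(l_bfile, DBMS_LOB.FILE_READONLY);
      DBMS_LOB.LOADFROMFILE(dest_lob => l_blob,
                            src_lob  => l_bfile,
                            amount   => DBMS_LOB.GETLENGTH(l_bfile));
      DBMS_LOB.FILECLOSE(l_bfile);
      COMMIT;
    END;
    /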

  • Upcoming Lumira Webinar June 11th on the topic of "Big Data Visualization and Custom Extensions"

    The next webinar in the Lumira series is coming up this week on Wednesday, June 11th, 10:00 AM - 11:00 AM Pacific Standard Time. The day is almost here and if you've already registered, thank you!
    If not, please click here to register.
    Speaker Profile:
    The topic is presented by Jay Thoden van Velzen, Program Director, Global HANA Services/Big Data Services Center of Excellence at SAP!
    Jay has been working in Analytics/Business Intelligence since it was called Decision Support Systems in the late 90s. Currently he is focused on Big Data solutions and on making the various components of such a solution run smoothly together using the SAP HANA Platform.
    Abstract:
    Big Data analysis poses unique and new challenges to data visualization compared to more traditional analytics. Such analysis often includes frequency counts, analysis of relationships in a network, and elements of statistical and predictive modeling. In many cases, the traditional visualization techniques of bar and column charts, pie charts, and line graphs are not the most appropriate. We have to avoid the “beautiful hairball” and make it easy for end users to absorb the information through clever use of filtering, transparency, and interactivity. We will likely also need to provide more context to go with the visualization than we have been used to in traditional analytics. Moreover, in the case of forecasts you need to include any confidence intervals in order not to mislead.
    This means we need more chart types, and often the chart types you need may not exist, nor could the need for such chart types necessarily be foreseen. However, Lumira allows us to design and code our own D3.js visualizations and integrate them into Lumira while retaining all the data access methods (including SAP HANA) that it provides out of the box. This means we can develop visualizations that share the outcomes of Big Data analysis exactly how we feel they should be presented. During the webinar we will show a number of examples, and specifically the integration of forecasts coming from R into Lumira through a Lumira custom extension.
    We really hope to see you there!
    Cheers!
    Customer Experience Group

    Congrats to Joao and Alex!
     Microsoft Azure Technical Guru - May 2014  
    João Sousa
    Microsoft Azure - Remote Debugging How To?
    GO: "Clever. Well Explained and written. Thanks! You absolutely deserve the GOLD medal."
    Ed Price: "Fantastic topic and great use of images!"
    Alex Mang
    The Move to the New Azure SQL Database Tiers
    Ed Price: "Great depth and descriptions! Very timely topic! Lots of collaboration on this article from community members!"
    GO: "great article but images are missing"
    Alex Mang
    Separating Insights Data In Visual Studio Online
    Application Insights For Production And Staging Cloud Services
    Ed Price: "Good descriptions and clarity!"
    GO: "great article but images are missing"

  • Best practices for administering Oracle Big Data Appliance

    -        Best practices as part of administration of the Oracle Big Data infrastructure
    -        How do we lock down maximum space usage per project?
    E.g.: project team A can have a maximum limit of 10 TB of allocated space
    -        Restricting roles and access (read, write), and a place holder for common shared artifacts
    -        A template/procedure for code migration across dev, QA, and prod environments, etc.

    Your data is bigger than what I run, but what I have done in the past is to restrict each team's accounts to a separate datafile and limit its size to the maximum that I want them to use, then create their objects restricted to that location.
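    As a rough sketch of that approach in Oracle SQL (all names and sizes are hypothetical, and DBA privileges are assumed):
    -- one bigfile tablespace per project, capped at the project's quota
    CREATE BIGFILE TABLESPACE project_a_ts
      DATAFILE '/u01/oradata/project_a_01.dbf'
      SIZE 100G AUTOEXTEND ON MAXSIZE 10T;
    -- point the project account at it and block other tablespaces
    ALTER USER project_a_user
      DEFAULT TABLESPACE project_a_ts
      QUOTA UNLIMITED ON project_a_ts
      QUOTA 0 ON users;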

  • What is the best big data solution for interactive queries over up to hundreds of millions of rows?

    We have a simple table such as follows:
    | Name | Attribute1 | Attribute2 | Attribute3 | ... | Attribute200 |
    | Name1 | Value1 | Value2 | null | ... | Value3 |
    | Name2 | null | Value4 | null | ... | Value5 |
    | Name3 | Value6 | null | Value7 | ... | null |
    | ... |
    But there could be up to hundreds of millions of rows/names. The data will be populated every hour or so.
    The goal is to get results for interactive queries on the data within a couple of seconds.
    Most queries look like:
    select count(*) from table
    where Attribute1 = Value1 and Attribute3 = Value3 and Attribute113 = Value113;
    The WHERE clause contains an arbitrary number of attribute name-value pairs.
    I'm new to big data and wondering what the best option is in terms of data store (MySQL, HBase, Cassandra, etc.) and processing engine (Hadoop, Drill, Storm, etc.) for interactive queries like the above.

    Hi,
    As always, the correct answer is "it depends".
    - Will there be more reads (queries) or writes (INSERTs)?
    - Will there be any UPDATEs?
    - Does the use case require any of the ACID guarantees, or would "eventual consistency" be fine?
    At first glance, Hadoop (HDFS + MapReduce) doesn't look like a viable option, since you require interactive queries. Also, if you require any level of ACID guarantees or UPDATE capabilities, the best (and arguably only) solution is an RDBMS. And keep in mind that millions of rows is pocket change for a modern RDBMS on average hardware.
    On the other hand, if there will be a lot more queries than inserts, very few or no updates at all, and eventual consistency will not be a problem, I'd probably recommend you test a key-value store (such as Oracle NoSQL Database). The idea would be to use (AttributeX, ValueY) as the key, and a sorted list of the names that have ValueY for their AttributeX as the value. This way you only do as many reads as there are attributes in the WHERE clause, and then compute the intersection (very easy and fast with sorted lists).
    Also, I'd do this computation manually. SQL may be comfortable, but I don't think it's Big Data ready yet (unless you chose the RDBMS way, of course).
    I hope it helped,
    Joan
    Edited by: JPuig on Apr 23, 2013 1:45 AM
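    To make the intersection idea concrete, here is a rough relational analogue in Oracle SQL; the inverted-index table and all names are hypothetical, and a key-value store would do the equivalent lookups through its own API:
    -- one row per (attribute, value, name) observation
    CREATE TABLE attr_index (
      attr VARCHAR2(30),
      val  VARCHAR2(100),
      name VARCHAR2(100),
      PRIMARY KEY (attr, val, name)
    );
    -- each branch is one "read" per WHERE predicate; INTERSECT is the
    -- sorted-list intersection described above
    SELECT COUNT(*) FROM (
      SELECT name FROM attr_index WHERE attr = 'Attribute1'   AND val = 'Value1'
      INTERSECT
      SELECT name FROM attr_index WHERE attr = 'Attribute3'   AND val = 'Value3'
      INTERSECT
      SELECT name FROM attr_index WHERE attr = 'Attribute113' AND val = 'Value113'
    );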

  • Oracle Business Intelligence with big data

    Has anyone implemented OBIEE 10g utilizing a denormalized data model for a very large transactional data set? There is a potential to generate reports in the 100s of millions of rows, with the data warehouse storing fact tables that have ~1.5 billion rows with a data consumption rate of over 200GB per month.
    Does anyone have any best practices, tips, or advice to determine the feasibility of implementing OBIEE 10g for such a large volume? The data is transactional and there are no current requirements for aggregate data sets. Is it feasible to use OBIEE Answers to generate these types of reports? Thus far I've seen OBIEE Answers hanging/crashing on reports that are > 10MB in size. Any configuration tips or feedback would be appreciated.
    Thanks,
    John

    I think that with a Big Data environment you need not worry about caching, runtime aggregation, and processing, if your configuration is right. The hardware, along with distributed processing, will take care of most of the load, since Big Data databases are designed to be in-memory and highly responsive.
    The thing you should consider is that the final output should not be too large. What I mean is, your request can process 100 million rows on the database, but the final output can be 1,000 rows, since BI is all about summarization.
    If you throw large data at the Presentation Server, it can make the presentation service unstable.
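    As an illustrative Oracle SQL sketch (table and column names are hypothetical), the idea is to push aggregation down to the database so only the summary reaches the Presentation Server:
    -- up to ~1.5 billion fact rows scanned, a few hundred summary rows returned
    SELECT region,
           TRUNC(txn_date, 'MM') AS month,
           SUM(amount)           AS total_amount
    FROM   fact_transactions
    GROUP  BY region, TRUNC(txn_date, 'MM');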

  • Configuring a Time Capsule for fixed IPs and restricting big data usage

    Looking at buying a 3 TB Time Capsule for my department.
    I have about 20 staff, among them a few Macs, but primarily Windows 7.
    I was wondering if I can set up the Time Capsule to allow fixed IP addresses only, and even possibly restrict internet usage, to prevent my staff from downloading big data?
    I know my company uses a firewall and basic networking switches; can this affect the Time Capsule, as it will be connected by Ethernet to a network switch?

    I was wondering if I can set up the Time Capsule to allow fixed IP addresses only
    Yes, but you will need the MAC address of each wireless device to set this up. This will only work for wireless devices, not for devices that are connected using Ethernet.
    and even possibly restrict internet usage, to prevent my staff from downloading big data?
    The Time Capsule does not have this type of capability. Look at appropriate Cisco or Netgear products if you need this feature.
    I know my company uses a firewall and basic networking switches; can this affect the Time Capsule, as it will be connected by Ethernet to a network switch?
    The Time Capsule would need to be configured in Bridge Mode to operate correctly on the network.

  • I'm using the Big date theme but the numbers on the date are cut off on top.  I'm on iPhoto 11 and OS 10.8.2.  Is anyone else having this issue?


    OK, try the following: make a temporary backup copy of the library (if you don't already have one), then:
    1 - delete the iPhoto preference file, com.apple.iPhoto.plist, that resides in your
         User/Home/Library/Preferences folder.
    2 - delete iPhoto's cache file, Cache.db, that is located in your
    User/Home/Library/Caches/com.apple.iPhoto folder (Snow Leopard and earlier),
    or with Mountain Lion in the User/Library/Containers/com.apple.iPhoto/
    Data/Library/Caches/com.apple.iPhoto folder.
    3 - launch iPhoto and try again.
    NOTE: If you've moved your library from its default location in your Home/Pictures folder, you will have to point iPhoto to its new location when you next open iPhoto by holding down the Option key while launching it.  You'll also have to reset iPhoto's various preferences.
    NOTE 2:  In Lion and Mountain Lion the Library folder is invisible by default. To make it permanently visible, enter the following in the Terminal application window: chflags nohidden ~/Library and press Return - 10.7: Un-hide the User Library folder.

  • Strategy for big data

    Dear experts,
    Currently I'm facing a Big Data problem. We have about 1 TB of transaction records per month.
    Now I'm trying to create data marts for that and install OBIEE. What are the strategy and steps?
    Please advise...
    BR,
    Eba

    Denis,
    In this case you can do it two ways.
    1. Proxies - You will have to develop a custom report which will collect all the data that needs to be sent, and call the proxy with the collected data as input.
    2. IDocs - If you are dealing with standard IDocs, this is easier. You can activate the configuration to send the IDocs for contracts for all the operations that you have mentioned. Do the required outbound configuration in WE20 to set the target system as XI.
    I am not sure why you are even thinking of scheduling a BPM in XI that will invoke the RFC. SAP as such has got scheduling capabilities; I would rather suggest you use those.
    Regards,
    Ravi

  • Big data and database administration

    Hi,
    I am working as an Oracle DBA. I would like to know what the DBA role is for Big Data & NoSQL.
    Is it really useful for a DBA to learn big data?
    Thanks,

    Is there any relationship between these two fields?
    You are comparing cheese with chalk.
    How can I learn more about data warehousing?
    Start with the Oracle doc, Oracle® Database Data Warehousing Guide:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14223/toc.htm

  • Big Data Shakes Up Credit Scoring

    News Source: Money's Edge (07/23/2015). Article Title: Big Data Shakes Up Credit Scoring. http://www.moneysedge.com/newsimage?id=198

    JustMe3 wrote:
    I think it's kind of scary how far they are going to dig up information on us. I mean, I understand lenders want to make sure they are reducing risk, but looking at our Facebook friends in order to make a decision seems a bit out there to me. Sure, birds of a feather flock together, but come on; I don't even know half of the people on my Facebook personally, just from networking. Big data scares me sometimes.
    Good thing I don't have Facebook, Twitter, LinkedIn or any other social networking. I knew it'd pay off at some point.

  • Big Data example

    Hi all, 
    I have been hearing the term big data for quite some time now...
    Whenever I look on the web I only find infrastructure explanations...
    What does it mean in terms of T-SQL coding? How is it stored on SQL Server (are tables used)? Is there a "Hello World" example for big data?
    Sorry if my question is a bit weird, but that's how I've always started learning any new programming language or API.
    Thanks in advance, 
    Dror

    Hi,
    What is big data?
    From the wiki:
    Big data is an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process using on-hand data management tools or traditional data processing applications.
    http://en.wikipedia.org/wiki/Big_data
    What is Hadoop?
    Hadoop is designed to efficiently process large volumes of information by connecting many commodity computers together to work in parallel.
    https://developer.yahoo.com/hadoop/tutorial/
    Hadoop distribution for Microsoft
    http://hortonworks.com/partner/microsoft/
    Microsoft PDW
    http://gnanadurai.blogspot.in/
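    For a taste of what this can look like in T-SQL, here is a minimal sketch assuming SQL Server with PolyBase configured against a Hadoop cluster (all names, paths, and columns are hypothetical):
    -- where the Hadoop data lives and how the files are delimited
    CREATE EXTERNAL DATA SOURCE hdp
      WITH (TYPE = HADOOP, LOCATION = 'hdfs://namenode:8020');
    CREATE EXTERNAL FILE FORMAT csv_format
      WITH (FORMAT_TYPE = DELIMITEDTEXT,
            FORMAT_OPTIONS (FIELD_TERMINATOR = ','));
    -- an external table is just metadata over files in HDFS; no data is moved
    CREATE EXTERNAL TABLE dbo.WebLogs_ext (
      log_date DATE,
      url      NVARCHAR(400),
      hits     INT)
      WITH (LOCATION = '/data/weblogs/',
            DATA_SOURCE = hdp,
            FILE_FORMAT = csv_format);
    -- after that, it is queried with ordinary T-SQL
    SELECT TOP 10 url, SUM(hits) AS total_hits
    FROM   dbo.WebLogs_ext
    GROUP  BY url
    ORDER  BY total_hits DESC;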

  • Big Data is a Big Drag? | Metrics Not Myths | Adobe TV

    Companies talk about Big Data a lot right now. So what is it? Can we manage it? Join the debate on Facebook (bit.ly/MetricsNotMyths) and on Twitter #MetricsNotMyths. Follow @AdobeMktgCloud
    http://adobe.ly/TIK1zU

    Big data.  A well-deserved discussion.

  • Big Data Lite ver 3.0 cannot launch movieplex demo

    I downloaded the Big Data Lite 3.0 virtual machine, successfully imported the .ova, and logged in as oracle to the Linux KDE desktop.
    In Firefox I clicked "Start Here", then clicked "http://localhost:7001/movieplex/index.jsp". Immediately I got the Firefox error:
    Firefox can't establish a connection to the server at localhost:7001.
    I googled and not much information was found. I am not sure if there's a Java error in setDomainEnv.sh, as some posts indicated. I followed them and made the modifications, but still couldn't get the website up. I am not into the lab exercises yet; I'm just trying to run the demo and I've already hit an error.
    Can anyone help please?

    Did anyone get this working on the Big Data Lite VM?
    I was getting the Jackson NoSuchMethod... error, but I was able (I think) to get around it by downloading newer versions of the Jackson jars.
    After I click Sign In, all I get is the text: oracle.kv.impl.api.KVStoreImpl@<a_value_which_changes_every_click>
    Should I just give up and load 2.5? Is it more stable?
    How deep do the errors go? This is the third thing I have had to fix, and I am wasting valuable time spinning my wheels.
    Thanks,
    matt
