Displaying 20 results from an estimated 200 matches similar to: "SparksR"
2018 May 07
0
Discovering patterns in textual strings
Bert
Here are some examples of the type of text strings I'm dealing with:
??????.??.???
??????.??.??????????
µTorrent® Pro - Torrent App
µTorrent®-Torrent Downloader
1 Pic 8 Words - Syllables
1 Pic 8 Words - Syllables
27043_Spanish songs for children
28.android.com.alpha.horoscope
28.android.com.bravo.horoscope
28.Card Game - Offline
28.card Game Multiplayer
37045_Spanish songs
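One way to start exploring strings like these is to cluster them by edit distance; a minimal sketch, reusing a few of the rows above and an arbitrary cut height:
x <- c("28.Card Game - Offline", "28.card Game Multiplayer",
       "27043_Spanish songs for children", "37045_Spanish songs")
d  <- adist(tolower(x))     # pairwise edit distances (base R, utils)
hc <- hclust(as.dist(d))    # hierarchical clustering on those distances
cutree(hc, h = 10)          # rough groups of near-duplicate strings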
2017 Jan 04
3
Big data with R
Hello.
Lately there have been several threads on the list about analysing large
volumes of data with R.
The alternatives that have been mentioned are:
- Using a more powerful machine, via Amazon Web Services, for example
- Parallelisation with OpenMP
- h2o and its R package,
- The sparklyr package as a wrapper for Spark's algorithms (sketched below),
And of course, using sampling, or even, if we have
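For reference, a minimal sketch of the sparklyr route (local master, toy data; assumes sparklyr and a local Spark install): dplyr verbs on the Spark table are translated to Spark SQL, so the data need not fit in R's memory.
library(sparklyr)
library(dplyr)
sc  <- spark_connect(master = "local")
tbl <- copy_to(sc, mtcars, "mtcars_spark")   # toy stand-in for a big table
tbl %>% group_by(cyl) %>% summarise(mean_mpg = mean(mpg, na.rm = TRUE))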
2018 May 05
1
Discovering patterns in textual strings
"Does that help?"
No. I am not your private consultant. You need to reply to the list, which
I have cc'ed here, not just me.
I am still somewhat confused by your specifications, but others may not be.
Part of my confusion stems from your failure to provide a reproducible
example (see e.g. the posting guide linked below). For example, I cannot
tell from your text whether the Abc
2015 Sep 01
0
lazy loading in SparkR
Hi,
I'm using SparkR, and R won't read the promises from the SparkR package
unless I run lazyLoad manually.
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
print(.libPaths())
# [1] "/private/tmp/spark-1.5/spark-1.5.0-SNAPSHOT-bin-hadoop2.6/R/lib"
# [2] "/usr/local/lib/R/3.2/site-library"
# [3]
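For comparison, a sketch of the loading sequence that should trigger the lazy loading by itself (assumes SPARK_HOME is set; sparkR.init() is the Spark 1.5-era entry point):
library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))
sc <- sparkR.init(master = "local")   # creates the SparkContext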
2017 Sep 04
1
Suggestion: Create On-Disk Dataframes
On 4 September 2017 at 11:35, Suzen, Mehmet wrote:
| It is not needed. There is a large community of developers using SparkR.
| https://spark.apache.org/docs/latest/sparkr.html
| It does exactly what you want.
I hope you are not going to mail a sparkr commercial to this list every day.
As the count is now at two, this may be an excellent time to stop it.
Dirk
--
2023 May 16
1
Recombining Mon and Year values
At 21:29 on 16/05/2023, Jeff Reichman wrote:
> R Help
>
>
>
> I have a data.frame where I've broken out year <dbl> and ordered
> month <ord> values. But I need to recombine them so I can graph mon-year in
> order, but when I recombine I lose the month order and the results are
> plotted alphabetically.
>
>
>
> Year month
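One common fix, sketched here with hypothetical data: rebuild a real Date from the two parts so the axis sorts chronologically (assumes `month` holds "Jan", "Feb", ... and an English locale).
df <- data.frame(Year = c(2021, 2021), month = c("Mar", "Jan"))
df$mon_year <- as.Date(paste(df$Year, df$month, 1), format = "%Y %b %d")
df[order(df$mon_year), ]   # now sorts (and plots) in calendar order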
2017 Sep 04
0
Suggestion: Create On-Disk Dataframes
It is not needed. There is a large community of developers using SparkR.
https://spark.apache.org/docs/latest/sparkr.html
It does exactly what you want.
On 3 September 2017 at 20:38, Juan Telleria <jtelleriar at gmail.com> wrote:
> Dear R Developers,
>
> I would like to suggest the creation of a new S4 object class for On-Disk
> data.frames which do not fit in RAM, which
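For reference, a minimal sketch of the SparkR route being advertised (hypothetical parquet file): the data stays on disk/cluster and only comes into R on collect().
library(SparkR)
sparkR.session(master = "local")
df <- read.df("data.parquet", source = "parquet")
head(df)   # evaluated by Spark; the full table is never loaded into RAM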
2017 Nov 08
0
Help Converting Calendars
How about
> p_dates <- paste0(p.dates[[3]], "-", p.dates[[2]], "-", p.dates[[1]])
> myData$p_dates <- p_dates
> print(myData, right=FALSE)
dates p_dates
1 2017-10-01 1396-7-9
2 2017-10-02 1396-7-10
3 2017-10-03 1396-7-11
> str(myData)
'data.frame': 3 obs. of 2 variables:
$ dates : Date, format: "2017-10-01"
2023 May 16
3
Recombining Mon and Year values
R Help
I have a data.frame where I've broken out year <dbl> and ordered
month <ord> values. But I need to recombine them so I can graph mon-year in
order, but when I recombine I lose the month order and the results are
plotted alphabetically.
Year month mon_year
<dbl> <ord>
2021 Mar Mar-2021
2021 Jan
2017 Oct 04
2
Reading parquet files from R
Hello everyone.
I already know that with sparkR or sparklyr I can easily read files in
parquet format, but is there a way to read them without having to start
spark?
My situation is that I have some files in parquet format in s3 and I want
to read them from a tiny amazon EC2 instance that I want to keep free of a
spark installation.
I'm poking around the https://github.com/cloudyr/aws.s3 library and it
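One Spark-free option, sketched on the assumption that the arrow package is acceptable (bucket and key names are hypothetical): fetch the object with aws.s3, then read it locally.
library(aws.s3)
save_object("path/file.parquet", bucket = "mybucket", file = "file.parquet")
library(arrow)
df <- read_parquet("file.parquet")   # no Spark needed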
2018 May 04
4
Discovering patterns in textual strings
R Help Forum
Is there an R library (or a way) with which I can extract unique character strings,
or repeating patterns, in textual strings? Say for example I have the
following records:
Abc_1234_kjhksh_276
Abc
Abc_1234_lakdofyo_324
Bce_876_skdhk_*&^%*&
Bce
Bce_454
And I would like to see the following results
Abc
Abc_1234
Bce
Jeff Reichman
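A minimal sketch of one reading of the request: keep the leading alphabetic token plus an optional numeric second token, then list the unique results.
x <- c("Abc_1234_kjhksh_276", "Abc", "Abc_1234_lakdofyo_324",
       "Bce_876_skdhk_*&^%*&", "Bce", "Bce_454")
prefix <- sub("^([A-Za-z]+(_[0-9]+)?).*", "\\1", x)
unique(prefix)   # "Abc_1234" "Abc" "Bce_876" "Bce" "Bce_454"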
2017 Nov 08
2
Help Converting Calendars
R-Help
Trying to convert a Gregorian calendar dataset to a Persian calendar
dataset, but I end up with a list and am not sure what to do. For example ...
dates <- c("2017-10-1","2017-10-2","2017-10-3")
myData <- data.frame(dates)
myData$dates <- as.Date(myData$dates, format = "%Y-%m-%d")
> myData
dates
1 2017-10-01
2 2017-10-02
3
2018 Apr 12
3
Bivariate Normal Distribution Plots
R-Help
I am attempting to create a series of bivariate normal distribution plots. Using the mvtnorm library, I have created the following code ...
# Standard deviations and correlation
sig_x <- 1
sig_y <- 1
rho_xy <- 0.0
# Covariance between X and Y
sig_xy <- rho_xy * sig_x * sig_y
# Covariance matrix
Sigma_xy <- matrix(c(sig_x ^ 2, sig_xy, sig_xy, sig_y ^ 2), nrow = 2, ncol = 2)
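Continuing that setup, a minimal sketch of one way to draw the density: evaluate mvtnorm::dmvnorm() on a grid and contour it.
library(mvtnorm)
x <- seq(-3, 3, length.out = 100)
y <- seq(-3, 3, length.out = 100)
z <- outer(x, y, function(x, y)
  dmvnorm(cbind(x, y), mean = c(0, 0), sigma = Sigma_xy))
contour(x, y, z, xlab = "X", ylab = "Y",
        main = "Bivariate normal density")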
2018 Mar 13
2
Understanding TS objects
R Help Community
I'm trying to understand time series (TS) objects. I thought I understood, but I have recently run into a series of error messages that I'm not sure how to handle. I have 15 years of quarterly data and I typically create a TS object via something like...
data.ts <- ts(mydata, start = 2002, frequency = 4)
this creates a matrix as opposed to a vector object, as I receive a
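That behaviour is expected: ts() keeps the matrix structure of a data frame (even a one-column one), while a plain numeric vector gives a vector-backed univariate series. A minimal sketch with hypothetical quarterly data:
mydata   <- data.frame(value = rnorm(60))          # 15 years x 4 quarters
data.mat <- ts(mydata, start = 2002, frequency = 4)
is.matrix(data.mat)    # TRUE: data-frame input keeps the matrix shape
data.ts  <- ts(mydata$value, start = 2002, frequency = 4)
is.matrix(data.ts)     # FALSE: vector input gives a univariate ts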
2017 Oct 04
2
Reading parquet files from R
Hello Carlos.
spark_read_parquet is from sparklyr and needs an initialized sparkcontext
to read the parquet file.
On Wed, 4 Oct 2017 at 22:11, Carlos Ortega <cof en qualityexcellence.es>
wrote:
> Hello José Luis,
>
> Have you tried directly with "dplyr"?...
>
> spark_read_parquet
>
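For reference, a minimal sketch of that call (hypothetical table name and s3 path; the connection must be created first):
library(sparklyr)
sc  <- spark_connect(master = "local")
tbl <- spark_read_parquet(sc, name = "mytbl", path = "s3a://mybucket/path/")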
2020 Oct 01
0
summarize_all Function
Hello,
Either of the two will do; the first is now preferred.
library(dplyr)
mtcars %>%
summarise(across(everything(), sum))
mtcars %>%
summarise_all(sum) # no need for `funs()`
Hope this helps,
Rui Barradas
At 18:29 on 01/10/20, Jeff Reichman wrote:
> r-help Forum
>
>
>
> I'm using the dplyr::summarize_all(funs(sum)) function and am receiving a
>
2020 Oct 01
0
summarize_all Function
The warning gives some suggestions. E.g., replace funs(sum,prod) with
list(sum=sum,prod=prod).
% R CMD Rscript -e 'library(dplyr,warn.conflicts=FALSE);
data.frame(X=1:3,Y=c(11,13,17)) %>% summarize_all(funs(sum,prod))'
X_sum Y_sum X_prod Y_prod
1 6 41 6 2431
Warning message:
`funs()` is deprecated as of dplyr 0.8.0.
Please use a list of either functions or lambdas:
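For completeness, a minimal sketch of the suggested replacement:
library(dplyr)
data.frame(X = 1:3, Y = c(11, 13, 17)) %>%
  summarize_all(list(sum = sum, prod = prod))
#   X_sum Y_sum X_prod Y_prod
# 1     6    41      6   2431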
2018 May 26
0
Grouping by 3 variable and renaming groups
Hello,
See if this is it:
priceStore_Grps$StoreID <- paste("Store",
seq_len(nrow(priceStore_Grps)), sep = "_")
Hope this helps,
Rui Barradas
On 5/26/2018 2:03 PM, Jeff Reichman wrote:
> ALCON
>
>
>
> I'm trying to figure out how to rename groups in a data frame after
> grouping by selected variables. I am using the dplyr library to group my
2018 May 26
1
Grouping by 3 variable and renaming groups
Hello,
Sorry, but I think my first answer is wrong.
You probably want something along the lines of
sp <- split(priceStore_Grps, priceStore_Grps$StorePC)
res <- lapply(seq_along(sp), function(i){
sp[[i]]$StoreID <- paste("Store", i, sep = "_")
sp[[i]]
})
res <- do.call(rbind, res)
row.names(res) <- NULL
Hope this helps,
Rui Barradas
On 5/26/2018
2015 May 06
2
R-help-es Digest, Vol 75, Issue 7
Hello, I'm surprised to read your opinion ("(pure) R is not the ideal tool for directly handling 'big data'"), when precisely this past April SparkR (see the description from its website further below) was integrated into Apache Spark, and everyone in the "big data" game (buzzword if ever there was one) has been keeping a close eye on the official release this
Hola, me sorprende leer tu opinión ("R (puro) no es la herramienta ideal para el manejo directo del 'big data'") cuando precisamente este pasado mes de abril SparkR (ver descripción de su web más abajo) se ha integrado en Apache Spark y todo el mundo que está en "ese ajo" del "big data" (buzzword donde las haya) no le quita ojo a la publicación oficial este